When log4j.properties is chosen as the logging configuration file, the Flink program does not print any logs.
Cause
Conflicting logging dependency JARs: the log4j2 jars shipped in Flink's lib directory win the SLF4J binding, and log4j2 does not read the log4j 1.x style log4j.properties file.
Solution
Remove the following log4j2 dependencies from the lib directory; a quick way to verify which binding SLF4J actually ends up using follows the list:
log4j-1.2-api-2.12.1.jar
log4j-api-2.12.1.jar
log4j-core-2.12.1.jar
log4j-slf4j-impl-2.12.1.jar
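To confirm the removal took effect, you can check which logging backend SLF4J is bound to at runtime. The snippet below is a minimal sketch (the class name CheckSlf4jBinding is made up for illustration); with the log4j2 jars removed and the log4j 1.x binding (slf4j-log4j12 plus log4j 1.x) still on the classpath, it prints org.slf4j.impl.Log4jLoggerFactory, the binding that reads log4j.properties.

import org.slf4j.LoggerFactory;

public class CheckSlf4jBinding {
    public static void main(String[] args) {
        // Prints the concrete ILoggerFactory implementation, which reveals the
        // SLF4J binding in effect: org.slf4j.impl.Log4jLoggerFactory for
        // log4j 1.x, org.apache.logging.slf4j.Log4jLoggerFactory for log4j2.
        System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
    }
}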
log4j.properties
# This affects logging for both user code and Flink
log4j.rootLogger=INFO, infoFile

# The following lines keep the log level of common libraries/connectors on
# log level INFO. The root logger does not override this. You have to manually
# change the log levels here.
log4j.logger.akka=INFO
log4j.logger.org.apache.kafka=INFO
log4j.logger.org.apache.hadoop=INFO
log4j.logger.org.apache.zookeeper=INFO

# Log all infos in the given file
log4j.appender.infoFile=org.apache.log4j.RollingFileAppender
log4j.appender.infoFile.File=${log.file}
log4j.appender.infoFile.layout=org.apache.log4j.PatternLayout
log4j.appender.infoFile.layout.ConversionPattern=%d{yyyy/MM/dd HH:mm:ss,SSS} %p %C.%M(%L) | %m%n
log4j.appender.infoFile.append=true
log4j.appender.infoFile.MaxFileSize=32MB
log4j.appender.infoFile.MaxBackupIndex=128

# Suppress the irrelevant (wrong) warnings from the Netty channel handler
log4j.logger.org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline=ERROR, infoFile

# We only log the kafka appender logs to File to avoid deadlocks
log4j.logger.cloudera.shaded.org.apache.kafka=INFO, infoFile
log4j.additivity.cloudera.shaded.org.apache.kafka=false
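With this configuration in place, both Flink's internal logging and user code that logs through SLF4J end up in the infoFile appender. A minimal sketch of user code, assuming the log4j 1.x binding is active (the class name and messages are only illustrative):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyFlinkJob {
    // Logging goes through SLF4J; with the log4j 1.x binding active, the
    // messages are written by the infoFile appender to ${log.file}.
    private static final Logger LOG = LoggerFactory.getLogger(MyFlinkJob.class);

    public static void main(String[] args) {
        LOG.info("Job starting");  // INFO matches the root level, so it is printed
        LOG.debug("Debug detail"); // below INFO, filtered out by the root logger
    }
}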