Getting exception while launching yarn jobs due to kafka log4jappender
Question
When I run my YARN jobs with this log4j.properties as the configuration, they fail with the exception below. If I remove KAFKA from *rootLogger*, the jobs launch fine.
This is the same issue reported here:
https://github.com/wso2/product-ei/issues/2786
But I have not found a solution for it.
Environment: CDH 6.3.3
This is my log4j.properties file.
log4j.rootLogger=DEBUG,stdout,KAFKA
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Threshold=DEBUG
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%-5p %L %X{taskId} %X{stsId} %d{yyyy-MM-dd HH:mm:ss} %c %t %m%n
log4j.appender.alog=org.apache.log4j.RollingFileAppender
log4j.appender.alog.maxFileSize=10MB
log4j.appender.alog.maxBackupIndex=5
log4j.appender.alog.file=../logs/serverx.log
log4j.appender.alog.append=false
log4j.appender.alog.layout=org.apache.log4j.PatternLayout
log4j.appender.alog.layout.conversionPattern=%-5p %X{taskId} %X{stsId} %d{yyyy-MM-dd HH:mm:ss} %c %t %m%n
log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.KAFKA.layout.conversionPattern=%d%C{1}%t%5p %-4p%X{taskId} %X{stsId} %d{yyyy-MM-dd HH:mm:ss} %c %t %m%n%throwable
log4j.appender.KAFKA.topic=cdhuser_rocplus_roclog
log4j.appender.KAFKA.securityProtocol=PLAINTEXT
log4j.appender.KAFKA.ignoreExceptions=false
Exception:
Unexpected problem occured during version sanity check
Reported exception:
java.lang.NullPointerException
at org.slf4j.LoggerFactory.versionSanityCheck(LoggerFactory.java:267)
at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:126)
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:412)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
at org.apache.kafka.clients.CommonClientConfigs.<clinit>(CommonClientConfigs.java:32)
at org.apache.kafka.clients.producer.ProducerConfig.<clinit>(ProducerConfig.java:333)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:327)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:299)
at org.apache.kafka.log4jappender.KafkaLog4jAppender.getKafkaProducer(KafkaLog4jAppender.java:279)
at org.apache.kafka.log4jappender.KafkaLog4jAppender.activateOptions(KafkaLog4jAppender.java:273)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:122)
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:111)
at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
at org.apache.spark.internal.Logging$class.log(Logging.scala:49)
at org.apache.spark.deploy.yarn.ApplicationMaster$.log(ApplicationMaster.scala:771)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:786)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.kafka.clients.producer.ProducerConfig.<clinit>(ProducerConfig.java:333)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:327)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:299)
at org.apache.kafka.log4jappender.KafkaLog4jAppender.getKafkaProducer(KafkaLog4jAppender.java:279)
at org.apache.kafka.log4jappender.KafkaLog4jAppender.activateOptions(KafkaLog4jAppender.java:273)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:122)
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:111)
at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
at org.apache.spark.internal.Logging$class.log(Logging.scala:49)
at org.apache.spark.deploy.yarn.ApplicationMaster$.log(ApplicationMaster.scala:771)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:786)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.lang.NullPointerException
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:418)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
at org.apache.kafka.clients.CommonClientConfigs.<clinit>(CommonClientConfigs.java:32)
... 28 more
# Answer 1
**Score**: 1
We had the same problem with CDH 6.3.3.
It looks like there is a conflict between the `kafka-log4j-appender:2.2.1-cdh6.3.3` and `slf4j-api:1.7.25` libraries.
The workaround we used was to specify a different version of the `slf4j-api` and `slf4j-log4j12` libraries (the 1.7 version did not work; we had to use the 1.8 version).
The steps to solve this issue are:
- First, specify these libraries as dependencies in the `pom.xml` file ...
<properties>
<slf4j.version>1.8.0-beta4</slf4j.version>
</properties>
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
</dependencies>
- Use a plugin to copy these dependencies to the `target` directory ...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>3.1.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
</execution>
</executions>
<configuration>
<includeScope>provided</includeScope>
<outputDirectory>target</outputDirectory>
<includeArtifactIds>slf4j-api,slf4j-log4j12</includeArtifactIds>
<stripVersion>true</stripVersion>
</configuration>
</plugin>
- Include the following properties in your Spark configuration:
spark.driver.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar
spark.executor.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar
- Finally, use this parameter with `spark-submit` (a full example invocation is sketched after this step):
--jars slf4j-api.jar,slf4j-log4j12.jar
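Putting the pieces together, a minimal sketch of what the full `spark-submit` invocation might look like is shown below. This is not part of the original answer: the main class `com.example.MyJob`, the application jar `target/my-job.jar`, and the shipped `log4j.properties` file name are placeholders, and it assumes the slf4j jars were copied into `target/` by the `maven-dependency-plugin` step above and that the job runs in YARN cluster mode. The `--files` and `-Dlog4j.configuration` lines show one common way to ship the custom log4j.properties to the containers.

# Sketch only: class name, jar locations and log4j.properties path are placeholders.
# Assumes slf4j-api.jar and slf4j-log4j12.jar were copied to target/ by maven-dependency-plugin
# and that the job is submitted in YARN cluster mode.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyJob \
  --jars target/slf4j-api.jar,target/slf4j-log4j12.jar \
  --files log4j.properties \
  --conf spark.driver.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar \
  --conf spark.executor.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar \
  --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties \
  --conf spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties \
  target/my-job.jar

In cluster mode the jars listed in `--jars` are distributed to each container's working directory, which is why the `extraClassPath` entries can reference the bare file names `slf4j-api.jar:slf4j-log4j12.jar` rather than full paths.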