Spark application unable to read custom log4j.properties

Question

I have been at this for several days now; my objective is simple.

I am setting up a SparkConf() object inside a Java application, and I need to specify a custom path to the log4j.properties file. The application is meant to run on a Spark worker that has the required custom log4j.properties file.

It seems like my Spark configuration is unable to find this file and is using the default configuration instead.

I have added the log4j.properties file in several places inside the worker pod, such as /app/spark/conf/log4j.properties, but it doesn't seem to work.
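
For reference, the file at that path is a plain log4j 1.x properties file; the version below is only illustrative, and the actual file on the pod may differ:

    log4j.rootLogger=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n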

Here's how I'm trying to set the custom path:

        SparkConf sc = new SparkConf().setMaster(master)
                                      .set("spark.driver.extraJavaOptions", "-Dlog4j.configuration=/app/spark/conf/log4j.properties")
                                      .set("spark.executor.extraJavaOptions", "-Dlog4j.configuration=/app/spark/conf/log4j.properties");

The last two .set() calls currently have no effect on the Spark configuration. Any idea what's wrong with this? Is something missing on my end?

Help...

Answer 1

Score: 1

It seems like you are giving the path without the "file:" prefix. Just add it as shown below, and log4j should be able to refer to your file if it exists at that path:

SparkConf sc = new SparkConf().setMaster(master)
                              .set("spark.driver.extraJavaOptions", "-Dlog4j.configuration=file:/app/spark/conf/log4j.properties")
                              .set("spark.executor.extraJavaOptions", "-Dlog4j.configuration=file:/app/spark/conf/log4j.properties");
