How to filter logs from the Spark log4j-defaults.properties file in IntelliJ

Question

I am running a Spark job in IntelliJ using Maven dependencies. The problem is that the console logging is driven by Spark's log4j-defaults.properties file, and the output is full of INFO messages that I really don't want. I would like to modify the log configuration to get rid of the INFO logs (or any log level), or alternatively filter the INFO logs without modifying the properties file. The problem is that I cannot modify the log4j-defaults.properties file inside the Maven dependency in IntelliJ, or at least I haven't found a way to do it.

Any ideas?

Answer 1

Score: 1

After getting the Spark context you can try:

sparkContext.setLogLevel("OFF")

and you will not see any logs at all. You can also use "ERROR", which shows only logs at ERROR level or above.

Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.

Answer 2

Score: 0

Place these lines at the beginning of your Spark Scala code:

import org.apache.log4j.{Level, Logger}
Logger.getRootLogger.setLevel(Level.OFF)
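
A further option, complementing both answers: instead of trying to edit log4j-defaults.properties inside the Spark jar, you can shadow it. Spark only falls back to its bundled defaults when no log4j.properties is found on the classpath, so placing your own file at src/main/resources/log4j.properties should take precedence. A minimal sketch, assuming Spark built against log4j 1.x (the version that reads log4j-defaults.properties); adjust the root level to taste:

# src/main/resources/log4j.properties
# Root logger: show WARN and above on the console, suppressing Spark's INFO chatter
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

Unlike setLogLevel, which takes effect only after the SparkContext exists, a classpath log4j.properties also covers messages logged during startup, and it requires no change to the Maven dependency itself.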

huangapple
  • Published 2020-09-08 19:36:57
  • Please retain this link when reposting: https://go.coder-hub.com/63793083.html