How to filter Logs from the Spark log4j-defaults.properties file in IntelliJ
Question
I am running a Spark job in IntelliJ using the Maven dependencies. The problem is that the console logging is driven by Spark's log4j-defaults.properties file, and the output is full of INFO logs that I really don't want to see.

I would like to modify the log configuration to get rid of the INFO logs (or any log level), or maybe filter the INFO logs without modifying the properties file. The problem is that I cannot modify the log4j-defaults.properties file inside the Maven dependency in IntelliJ, or at least I haven't found a way to do it.

Any ideas?
Answer 1 (score: 1)
After getting the Spark context you can try:

sparkContext.setLogLevel("OFF")

and you will not see any logs at all. You can also use "ERROR", which will only show logs at error level or above.

Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
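For illustration, here is a minimal sketch of where that call fits in a job. The object name, app name, and master URL are placeholders, not part of the original answer:

import org.apache.spark.sql.SparkSession

object QuietSparkJob {
  def main(args: Array[String]): Unit = {
    // Hypothetical local session, just for illustration
    val spark = SparkSession.builder()
      .appName("quiet-job")
      .master("local[*]")
      .getOrCreate()

    // Silence Spark's console logging right after the context exists;
    // "ERROR" would keep error-level messages instead of muting everything
    spark.sparkContext.setLogLevel("OFF")

    // ... the actual job goes here ...

    spark.stop()
  }
}

Note that setLogLevel only takes effect once the context is created, so any startup messages printed before this line will still appear.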
Answer 2 (score: 0)
Place these lines at the beginning of your Spark Scala code:
import org.apache.log4j.{Level, Logger}
Logger.getRootLogger.setLevel(Level.OFF)
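A self-contained sketch of where those lines sit in a job (the object name, app name, and master URL are placeholders; this assumes a Spark build that still bundles log4j 1.x, since the imports come from org.apache.log4j):

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

object SilentLoggingJob {
  def main(args: Array[String]): Unit = {
    // Raise the root logger threshold before Spark starts emitting INFO output;
    // Level.WARN or Level.ERROR would keep warnings and errors visible instead
    Logger.getRootLogger.setLevel(Level.OFF)

    // Hypothetical local session, just for illustration
    val spark = SparkSession.builder()
      .appName("silent-logging-job")
      .master("local[*]")
      .getOrCreate()

    // ... the actual job goes here ...

    spark.stop()
  }
}

Because the root logger is muted before the session is built, this approach also suppresses the startup INFO messages that setLogLevel alone would miss.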