How can I configure Log4j2 to roll log files eagerly?
Question
I have some log files which I want to roll over daily and archive into gzips.
The relevant configuration looks like this:
<Appenders>
    <RollingFile name="MyLog"
                 fileName="${sys:log.dir}/mylog.log"
                 filePattern="${sys:log.dir}/mylog-%d{yyyy-MM-dd}.log.gz">
        <PatternLayout pattern="%d %p | %m | %c{1.} [%t]%n"/>
        <Policies>
            <TimeBasedTriggeringPolicy/>
        </Policies>
    </RollingFile>
</Appenders>
<Loggers>
    <Root level="INFO">
        <AppenderRef ref="MyLog"/>
    </Root>
</Loggers>
So a log directory could look like
mylog-2020-01-01.log.gz
mylog-2020-01-02.log.gz
mylog-2020-01-03.log.gz
mylog.log // today's logs (let's say it's the 4th of Jan)
This works fine, except that the log file is not gzipped eagerly at the end of the day; it is zipped lazily, whenever the first log statement of the following day is appended.
Due to the nature of what I'm logging (related to client sessions), this might not happen until significantly after midnight. It could be hours, or even days, later.
It would be convenient to be able to ls the files in the log directory for the date I'm interested in. At the moment, I can't do that reliably, because yesterday's logs may not have rolled yet; in that case mylog.log contains not today's logs, but yesterday's.
Is there a way to configure it so that it will roll the files eagerly, not lazily?
I could add a cron job to do the clean-up, but I'm hesitant: if logging is going on around midnight, I don't want to be messing around with files which log4j is trying to write to. It seems like it would be error-prone.
Answer 1
Score: 3
It looks like the problem is the TimeBasedTriggeringPolicy:

> The TimeBasedTriggeringPolicy causes a rollover once the date/time pattern no longer applies to the active file.

It seems like a cron-based policy will work for me:

> The CronTriggeringPolicy triggers rollover based on a cron expression. This policy is controlled by a timer and is asynchronous to processing log events, so it is possible that log events from the previous or next time period may appear at the beginning or end of the log file.
To execute at midnight every day:
<Policies>
    <CronTriggeringPolicy schedule="0 0 0 * * *"/>
</Policies>
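
For completeness, here is a minimal sketch of how the cron-based policy could slot into the appender from the question (same fileName, filePattern and layout; only the triggering policy changes):

<Appenders>
    <RollingFile name="MyLog"
                 fileName="${sys:log.dir}/mylog.log"
                 filePattern="${sys:log.dir}/mylog-%d{yyyy-MM-dd}.log.gz">
        <PatternLayout pattern="%d %p | %m | %c{1.} [%t]%n"/>
        <Policies>
            <!-- Timer-driven: rolls and gzips at midnight even if no further events are logged -->
            <CronTriggeringPolicy schedule="0 0 0 * * *"/>
        </Policies>
    </RollingFile>
</Appenders>

As the documentation quoted above notes, the rollover is driven by a timer rather than by incoming log events, so the occasional event logged right around midnight may end up on either side of the day boundary.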