Imply 2.8 and later
Imply 2.8 and later provide log rotation out of the box on supported platforms (Linux x86_64 and macOS) through the supervise program. Each service's logs appear in var/sv/<service>/current and are rotated to other files in that same directory before eventually being deleted.
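To inspect a service's logs under this scheme, list its log directory. The coordinator is used here purely as an illustration; substitute any service name:
ls -lh var/sv/coordinator/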
The remainder of this article is useful for Imply 2.7 and earlier.
Imply 2.7 and earlier
In Imply 2.7.x and earlier, Druid is configured to use `ConsoleAppender` for logging. However, `ConsoleAppender` does not roll over or archive old logs, so each service writes all of its logs to a single file, which can grow very large over time.
A popular solution is to use `RollingFileAppender` for logging. More details can be found here: https://logging.apache.org/log4j/2.0/manual/appenders.html#RollingFileAppender
The following is an example of configuring `RollingFileAppender` for coordinator logging:
1. Configure `jvm.config` for the coordinator by adding a `-Dservice` property at the end of the file, set to the service's name. It is important that each jvm.config have a different value for "service"!
[root@ip-172-31-12-141 coordinator]# cat /imply/imply-2.5.14/conf/druid/coordinator/jvm.config
-server
-Xms256m
-Xmx256m
-Duser.timezone=UTC
-Dfile.encoding=UTF-8
-Djava.io.tmpdir=var/tmp
-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager
-Dderby.stream.error.file=var/druid/derby.log
-Dservice=coordinator
The same configuration can be applied to the other services, changing the `-Dservice` value accordingly.
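For example, a broker's jvm.config would end with `-Dservice=broker`; the elided options sketched here simply mirror the coordinator example above, and real heap sizes differ per service:
...
-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager
-Dservice=broker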
2. Back up the original log4j2.xml file, then replace its contents with the following:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn" name="Imply" packages="">
  <Appenders>
    <RollingFile name="RollingFile" fileName="var/sv/${sys:service}-service.log"
                 filePattern="var/sv/${sys:service}-service-%d{yyyy-MM-dd}.%i.log.gz">
      <PatternLayout>
        <Pattern>%d{ISO8601} %p [%t] %c - %m%n</Pattern>
      </PatternLayout>
      <Policies>
        <TimeBasedTriggeringPolicy interval="1" modulate="true"/>
        <SizeBasedTriggeringPolicy size="100 MB"/>
      </Policies>
      <DefaultRolloverStrategy max="10"/>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="RollingFile"/>
    </Root>
  </Loggers>
</Configuration>
In this new configuration, old logs are rolled over and archived in gzip (.gz) format daily, or whenever the current log file grows larger than 100 MB.
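As an illustration of the filePattern above, the coordinator's archives would take names along these lines (the dates and %i counter values are hypothetical, and the exact names depend on the filePattern you use):
var/sv/coordinator-service-2018-06-23.1.log.gz
var/sv/coordinator-service-2018-06-24.1.log.gz
var/sv/coordinator-service-2018-06-24.2.log.gz
The %i counter increments when the 100 MB size limit forces more than one rollover within the same day, and DefaultRolloverStrategy max="10" caps how many of those indexed archives are kept per day.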
3. IMPORTANT: `RollingFileAppender` will break ingestion task logging, so we need to instruct tasks to keep using `ConsoleAppender` for their logging. To do so:
3a. Keep the default log4j2.xml that you backed up in step 2 and rename it to log4j2-task.xml. It will have the following contents:
<?xml version="1.0" encoding="UTF-8" ?>
<Configuration status="WARN">
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
3b. Add the `-Dlog4j.configurationFile` property to `druid.indexer.runner.javaOpts` in the middleManager's runtime.properties, pointing it at the path of the `log4j2-task.xml` file:
[root@ip-172-31-12-141 middleManager]# cat /imply/imply-2.5.14/conf/druid/middleManager/runtime.properties
...
# Task launch parameters
druid.indexer.runner.javaOpts=-server -Xms512m -Xmx512m -Duser.timezone=UTC -Dfile.encoding=UTF-8 -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager -Dlog4j.configurationFile=conf/druid/_common/log4j2-task.xml
...
4. The var/sv directory will look like this after the change is applied:
[root@ip-172-31-12-141 sv]# ls -lh
-rw-r--r--. 1 root root  37K Jun 24 00:00 broker-service-2018-06-23.log.gz
-rw-r--r--. 1 root root  14K Jun 25 02:00 broker-service-2018-06-24.log.gz
-rw-r--r--. 1 root root 188K Jun 25 22:55 broker-service.log
-rw-r--r--. 1 root root   1K Jun 25 22:55 broker.log
-rw-r--r--. 1 root root 574K Jun 25 22:55 imply-ui.log
In this example, "broker-service.log" is Druid's log4j2 log, and "broker.log" is standard out. The standard-out log file will generally contain nothing at first. Eventually, it may accumulate logs that do not go through log4j2, including:
- GC logs (from the JVM PrintGCTimeStamps option).
- Crash logs.
- Thread dumps generated by "kill -3" (see the example below).
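For example, a thread dump can be triggered by sending SIGQUIT to a service's JVM; with the setup above, the broker's dump would appear in its standard-out file, var/sv/broker.log. The PID below is a placeholder:
# Send SIGQUIT (equivalent to kill -3) to the broker JVM; the resulting
# thread dump is written to the broker's standard out, i.e. var/sv/broker.log.
kill -3 <broker-jvm-pid>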