druid release: 0.10.0
Hello Everybody!
The index_hadoop task, running in a YARN NodeManager container (on a datanode), logs everything to stdout, and the NodeManager saves that output to a stdout file under a configurable directory (yarn-site.xml: yarn.nodemanager.log-dirs). The problem is that I cannot control what the Druid indexer outputs inside such a job container.
How can I set the log level to WARN? Perhaps I could somehow "submit" a log4j2.xml for the job to the datanode, but how, and where would I put it? Configuring the log level for the YARN NodeManager or for MapReduce (via container-log4j.properties) didn't help.
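For reference, this is roughly the log4j2.xml I would like the job containers to pick up — a minimal sketch that sets the root logger to WARN and keeps logging to stdout (the pattern layout is just an example):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- keep writing to stdout so YARN still captures the container log -->
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- suppress everything below WARN -->
    <Root level="warn">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```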
In the submitted job I use a separate classloader (maybe this is important):
"mapreduce.job.classloader": "true",
"mapreduce.job.classloader.system.classes": "-javax.validation.,java.,javax.,org.apache.commons.logging.,org.apache.log4j.,org.apache.hadoop."