change verbose level of Spark Shell

ritch...@abrs.com.hk

Sep 28, 2018, 7:59:07 AM9/28/18
to Machine Intelligence and Data Science WonderCorp
I ran into this in the IBM Spark Fundamentals lab "Configuring and Monitoring Spark Applications": I changed the verbosity setting, but it had no effect.

After some research, I found the following solution.


To reduce the shell's output verbosity (the three levels relevant here, from most to least verbose, are INFO, WARN, and ERROR):
 
1. In SPARK_HOME/conf/log4j.properties (in my case SPARK_HOME is /usr/local/spark; run  echo $SPARK_HOME  if you are not sure of your path), set
    log4j.rootCategory=WARN  (or =ERROR to show only errors)

2. Please also note that the same file may contain a shell-specific setting that overrides the root category, such as:

log4j.logger.org.apache.spark.repl.Main=WARN

or

shell.log.level=INFO

If such a line is present, you need to change it as well (or instead), because it takes precedence over the root setting at the shell level.
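Putting steps 1 and 2 together, a minimal conf/log4j.properties could look like the sketch below. The console appender lines follow the layout of Spark's bundled log4j.properties.template; your template's appender names and pattern may differ, so adjust accordingly:

```properties
# Default everything to WARN; "console" is the appender defined below
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Shell-specific override: inside spark-shell this wins over rootCategory,
# so it must also be WARN (or ERROR), not INFO
log4j.logger.org.apache.spark.repl.Main=WARN
```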

The settings are hierarchical: a more specific setting overrides a less specific one, so the effective level is decided by the highest-priority match (S1 overrides S2, which overrides S3, ... SN, in decreasing priority).


Please also note that there are other overriding settings (on the SparkConf of sc, or flags passed to spark-submit) which might override your default settings.
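If you only want to change the level for the current session, the SparkContext also exposes a runtime setter (available since Spark 1.4, so it works on 2.2.1) that overrides whatever log4j.properties says:

```scala
// Inside spark-shell, sc is the pre-created SparkContext
sc.setLogLevel("WARN")   // valid values include INFO, WARN, ERROR
```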

Remarks: I am using Spark version 2.2.1
   