I encountered this in the IBM Spark Fundamentals lab "Configuring and Monitoring Spark Applications", where I changed the verbosity setting but it did not take effect.
After some research, I found the following solution.
To reduce the log output verbosity (the levels relevant here, from most to least verbose, are INFO, WARN, and ERROR):
1. In SPARK_HOME/conf/log4j.properties (in my case SPARK_HOME is /usr/local/spark; run echo $SPARK_HOME if you are not sure of your path),
set log4j.rootCategory=WARN (or =ERROR to show only errors).
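For reference, after the change the top of my log4j.properties looks roughly like this (the console appender lines come from Spark's bundled conf/log4j.properties.template; your file may differ):

```properties
# Log everything at WARN level and above to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```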
2. Please also note that if there is a more specific logger setting that overrides the root, such as:
log4j.logger.org.apache.spark.repl.Main=WARN
then you need to change that overriding setting instead for the change to take effect at the shell (REPL) level.
The settings resolve hierarchically, like Setting = S1 overrides S2 overrides S3 ... SN, where S1 is the most specific (highest-priority) logger and SN is the root: the most specific matching logger wins.
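This hierarchy can be illustrated with Python's standard logging module, which resolves effective levels the same way log4j does (the Spark logger names below are reused purely for the analogy):

```python
import logging

# Root logger plays the role of log4j.rootCategory.
logging.getLogger().setLevel(logging.ERROR)

# A more specific child logger, like log4j.logger.org.apache.spark.repl.Main=WARN.
repl = logging.getLogger("org.apache.spark.repl.Main")
repl.setLevel(logging.WARN)

# The child's own setting overrides the root's:
print(repl.getEffectiveLevel() == logging.WARN)    # True

# A logger with no explicit level inherits from its nearest configured
# ancestor, here the root:
other = logging.getLogger("org.apache.spark.sql")
print(other.getEffectiveLevel() == logging.ERROR)  # True
```

So setting the root category alone is not enough when a child logger such as org.apache.spark.repl.Main has its own level configured.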
Please also note that settings applied at runtime (for example via sc.setLogLevel on the SparkContext, a SparkConf, or flags passed to spark-submit) override the defaults in log4j.properties.
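As a sketch of such a runtime override (assuming a PySpark session; this snippet needs a Spark installation to run, and sc.setLogLevel takes precedence over log4j.properties for that session):

```python
from pyspark.sql import SparkSession

# Hypothetical standalone session for illustration; in spark-shell or
# pyspark an existing spark/sc is already provided.
spark = SparkSession.builder.appName("log-level-demo").getOrCreate()

# Overrides the log4j.properties default for this SparkContext only.
spark.sparkContext.setLogLevel("WARN")  # or "ERROR" to show only errors
```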
Remark: I am using Spark 2.2.1.