Why can't I get Spark running info on the Dr.Elephant UI?

Jack Java

May 16, 2016, 1:09:54 AM
to dr-elephant-users
Hi everybody, I'm Jake, from Beijing, China.
Last week I installed Dr.Elephant, Hive 2.0.0, and Spark 1.4.0.
MapReduce and Hive 2.0.0 jobs run fine with Dr.Elephant,
but something is wrong with Spark: Dr.Elephant's UI only displays the Spark application ID without any running details, as the picture below shows...

When I click one specific item (for example: [root] [Spark] application_1463124006951_0041), it gives me nothing...


As you can see from the picture... nothing at all~


Spark Memory Limit: shows none
Spark Stage Runtime: shows none
Spark Job Runtime: shows none
Spark Executor Load Balance: shows none

I use the spark-submit command line below for testing.

======================================================================================================================================================
spark-submit \
--name "Spark WordCount" \
--class main.scala.com.firstshare.jsonLogs.JSON_MonitorRequest.SparkWordCount \
--master yarn-cluster \
--conf spark.shuffle.spill=false \
--executor-memory 32M \
--driver-memory 64M \
--num-executors 2 \
/home/fsdevops/SparkJar/testSpark-1.0.0.jar \
10
======================================================================================================================================================
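If it matters, I believe the *.snappy files come from Spark's event-log settings. Here is a sketch of the relevant spark-defaults.conf entries (these are standard Spark properties; the directory value below is a placeholder, not my actual path):
======================================================================================================================================================
# spark-defaults.conf -- standard Spark event-log properties.
# With compression enabled, the event logs are written as *.snappy files
# (snappy is the default compression codec in Spark 1.x).
# The directory below is a placeholder, not my actual path.
spark.eventLog.enabled   true
spark.eventLog.compress  true
spark.eventLog.dir       hdfs:///<your-event-log-dir>
======================================================================================================================================================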
And I can see that ElephantRunner is trying to analyze Spark's *.snappy files, as the log below shows...
======================================================================================================================================================
05-16-2016 12:51:46 INFO  com.linkedin.drelephant.ElephantRunner : Job queue size is 1
05-16-2016 12:51:46 INFO  com.linkedin.drelephant.ElephantRunner : Executor thread 1 analyzing SPARK application_1463124006951_0052
05-16-2016 12:51:46 INFO  org.apache.spark.deploy.history.SparkFSFetcher$ : Replaying Spark logs for application: application_1463124006951_0052
05-16-2016 12:51:46 INFO  org.apache.spark.deploy.history.SparkFSFetcher$ : Replay completed for application: application_1463124006951_0052
05-16-2016 12:51:47 ERROR com.linkedin.drelephant.util.InfoExtractor : Unable to retrieve the scheduler info for application [application_1463124006951_0052]. It does not contain [spark.driver.extraJavaOptions] property in its spark properties.
05-16-2016 12:52:43 INFO  com.linkedin.drelephant.ElephantRunner : Fetching analytic job list...
05-16-2016 12:52:43 INFO  com.linkedin.drelephant.analysis.AnalyticJobGeneratorHadoop2 : Fetching recent finished application runs between last time: 1463374243946, and current time: 1463374303945
05-16-2016 12:52:43 INFO  com.linkedin.drelephant.analysis.AnalyticJobGeneratorHadoop2 : The succeeded apps URL is http://0.0.0.0:8088/ws/v1/cluster/apps?finalStatus=SUCCEEDED&finishedTimeBegin=1463374243946&finishedTimeEnd=1463374303945
05-16-2016 12:52:44 INFO  com.linkedin.drelephant.analysis.AnalyticJobGeneratorHadoop2 : The failed apps URL is http://0.0.0.0:8088/ws/v1/cluster/apps?finalStatus=FAILED&finishedTimeBegin=1463374243946&finishedTimeEnd=1463374303945
05-16-2016 12:52:44 INFO  com.linkedin.drelephant.ElephantRunner : Job queue size is 1
05-16-2016 12:52:44 INFO  com.linkedin.drelephant.ElephantRunner : Executor thread 3 analyzing SPARK application_1463124006951_0053
05-16-2016 12:52:44 INFO  org.apache.spark.deploy.history.SparkFSFetcher$ : Replaying Spark logs for application: application_1463124006951_0053
05-16-2016 12:52:44 INFO  org.apache.spark.deploy.history.SparkFSFetcher$ : Replay completed for application: application_1463124006951_0053
05-16-2016 12:52:44 ERROR com.linkedin.drelephant.util.InfoExtractor : Unable to retrieve the scheduler info for application [application_1463124006951_0053]. It does not contain [spark.driver.extraJavaOptions] property in its spark properties.
======================================================================================================================================================
I noticed the error message: it says the application does not contain the [spark.driver.extraJavaOptions] property in its Spark properties, which I suppose should be configured in spark-defaults.conf.
But I don't know whether this is the reason I can't get any Spark running details.
I would be very grateful if you could guide me on how to solve this problem~!!!
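P.S. Here is my guess at what that property might need to look like in spark-defaults.conf. This is only a sketch: I assume Dr.Elephant's InfoExtractor parses scheduler info (for example, Azkaban workflow/job links) out of the -D options in spark.driver.extraJavaOptions, and the azkaban.link.* keys and URL values below are placeholders I have not verified:
======================================================================================================================================================
# spark-defaults.conf -- a guess at the format, not a verified fix.
# I assume Dr.Elephant reads scheduler links out of the -D options below;
# the azkaban.link.* key names and URL values are unverified placeholders.
spark.driver.extraJavaOptions  -Dazkaban.link.workflow.url=<workflow-url> -Dazkaban.link.job.url=<job-url>
======================================================================================================================================================
The same thing could presumably also be passed per job with spark-submit --conf "spark.driver.extraJavaOptions=...". Please correct me if the real keys are different.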


Fawze Abujaber

Mar 26, 2018, 2:15:22 PM
to dr-elephant-users
Hi Jake,

Were you able to solve it?