Running Spark on YARN in yarn-client mode, got an error: SparkContext: Master yarn-client does not match expected format



Dec 24, 2013, 2:53:28 AM12/24/13
I followed the documentation and launched a Spark app in yarn-client mode.

this is my command:

SPARK_JAR=/root/spark/spark-0.8.0-incubating/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar \
SPARK_YARN_APP_JAR=examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar \
./run-example org.apache.spark.examples.SparkPi yarn-client

then I got an error:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/root/spark/spark-0.8.0-incubating/examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/root/spark/spark-0.8.0-incubating/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
13/12/24 15:39:28 INFO Slf4jEventHandler: Slf4jEventHandler started
13/12/24 15:39:29 INFO SparkEnv: Registering BlockManagerMaster
13/12/24 15:39:29 INFO MemoryStore: MemoryStore started with capacity 1256.9 MB.
13/12/24 15:39:29 INFO DiskStore: Created local directory at /tmp/spark-local-20131224153929-fec4
13/12/24 15:39:29 INFO ConnectionManager: Bound socket to port 42147 with id = ConnectionManagerId(kmHadoop3,42147)
13/12/24 15:39:29 INFO BlockManagerMaster: Trying to register BlockManager
13/12/24 15:39:29 INFO BlockManagerMaster: Registered BlockManager
13/12/24 15:39:30 INFO HttpBroadcast: Broadcast server started at
13/12/24 15:39:30 INFO SparkEnv: Registering MapOutputTracker
13/12/24 15:39:30 INFO HttpFileServer: HTTP File server directory is /tmp/spark-4bfef083-f05a-485c-8e1d-fa7518eb7116
13/12/24 15:39:32 INFO SparkUI: Started Spark Web UI at http://testSpark:4040
13/12/24 15:39:40 INFO SparkContext: Added JAR /root/spark/spark-0.8.0-incubating/examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar at with timestamp 1387870780009
13/12/24 15:39:40 WARN SparkContext: Master yarn-client does not match expected format, parsing as Mesos URL
Failed to load native Mesos library from /usr/java/jdk1.6.0_31/jre/lib/amd64/server:/usr/java/jdk1.6.0_31/jre/lib/amd64:/usr/java/jdk1.6.0_31/jre/../lib/amd64:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
Exception in thread "main" java.lang.UnsatisfiedLinkError: no mesos in java.library.path
        at java.lang.ClassLoader.loadLibrary(
        at java.lang.Runtime.loadLibrary0(
        at java.lang.System.loadLibrary(
        at org.apache.mesos.MesosNativeLibrary.load(
        at org.apache.mesos.MesosNativeLibrary.load(
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:216)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)

Look at the warning line in the log above: Master yarn-client does not match expected format

Why does yarn-client not match the expected format? That is exactly what the documentation suggests, right?

My Spark version is 0.8.0, built against Hadoop 2.0.0-cdh4.3.0, as the log above shows.

Any help or reply is very much appreciated! Thanks.



Dec 24, 2013, 9:20:51 PM12/24/13
Never mind, I found out I was following the wrong version of the documentation.

yarn-client mode is only supported as of version 0.8.1, which was released just recently. I was using 0.8.0, which only supports yarn-standalone mode.
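For anyone else who hits this on 0.8.0: in the 0.8.0 "Running Spark on YARN" docs, the app is launched through org.apache.spark.deploy.yarn.Client with yarn-standalone as the master argument, roughly like the sketch below. The jar paths are copied from my setup above; the resource flag values (--num-workers, --worker-memory, --worker-cores) are placeholders you would adjust for your cluster.

```shell
# Sketch of the 0.8.0 yarn-standalone launch (flag values are placeholders).
# SPARK_JAR points at the Spark assembly; --jar at the application assembly.
SPARK_JAR=/root/spark/spark-0.8.0-incubating/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar \
./spark-class org.apache.spark.deploy.yarn.Client \
  --jar examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar \
  --class org.apache.spark.examples.SparkPi \
  --args yarn-standalone \
  --num-workers 2 \
  --worker-memory 1g \
  --worker-cores 1
```

In this mode the driver itself runs inside the YARN ApplicationMaster, unlike yarn-client mode where the driver stays on the submitting machine.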