SPARK_JAR=/root/spark/spark-0.8.0-incubating/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar \
SPARK_YARN_APP_JAR=examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar \
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/root/spark/spark-0.8.0-incubating/examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/root/spark/spark-0.8.0-incubating/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
13/12/24 15:39:28 INFO Slf4jEventHandler: Slf4jEventHandler started
13/12/24 15:39:29 INFO SparkEnv: Registering BlockManagerMaster
13/12/24 15:39:29 INFO MemoryStore: MemoryStore started with capacity 1256.9 MB.
13/12/24 15:39:29 INFO DiskStore: Created local directory at /tmp/spark-local-20131224153929-fec4
13/12/24 15:39:29 INFO ConnectionManager: Bound socket to port 42147 with id = ConnectionManagerId(kmHadoop3,42147)
13/12/24 15:39:29 INFO BlockManagerMaster: Trying to register BlockManager
13/12/24 15:39:29 INFO BlockManagerMaster: Registered BlockManager
13/12/24 15:39:30 INFO SparkEnv: Registering MapOutputTracker
13/12/24 15:39:30 INFO HttpFileServer: HTTP File server directory is /tmp/spark-4bfef083-f05a-485c-8e1d-fa7518eb7116
13/12/24 15:39:40 WARN SparkContext: Master yarn-client does not match expected format, parsing as Mesos URL
Failed to load native Mesos library from /usr/java/jdk1.6.0_31/jre/lib/amd64/server:/usr/java/jdk1.6.0_31/jre/lib/amd64:/usr/java/jdk1.6.0_31/jre/../lib/amd64:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
Exception in thread "main" java.lang.UnsatisfiedLinkError: no mesos in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1738)
at java.lang.Runtime.loadLibrary0(Runtime.java:823)
at java.lang.System.loadLibrary(System.java:1028)
at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:52)
at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:64)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:216)
at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
Why does yarn-client not match the expected format? That is exactly what the documentation suggests, right?
My Spark version is 0.8.0, built against Hadoop 2.0.0-cdh4.3.0 (spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar), as the log above shows.
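For comparison, the 0.8.0 "Running on YARN" guide launches applications through the YARN Client class with a yarn-standalone master, not yarn-client. A rough sketch of that documented invocation, reusing the same assembly jars as above (worker counts and memory sizes are placeholders, not from my setup):

```shell
# Sketch of the yarn-standalone launch path documented for Spark 0.8.0.
# Resource sizes below are placeholder values, not tested settings.
SPARK_JAR=assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar \
./spark-class org.apache.spark.deploy.yarn.Client \
  --jar examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar \
  --class org.apache.spark.examples.SparkPi \
  --args yarn-standalone \
  --num-workers 2 \
  --master-memory 1g \
  --worker-memory 1g \
  --worker-cores 1
```

As I understand it, in this mode the driver runs inside the YARN ApplicationMaster, so SparkContext never tries to parse the master string as a Mesos URL and the native Mesos library is not needed.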