Server: HP01 (10.0.1.201)
Built: 2012-03-22 21:43:55 by hadoop
Started: Tue Mar 27 2012 09:41:54 PM CST (2012-03-27 13:41:54 UTC)
ID: 201203272141-922681334-5050-4903
ssh ${MESOS_HOME}/deploy/mesos-daemon mesos-masters.sh </dev/null >/dev/null
ssh ${MESOS_HOME}/deploy/mesos-daemon mesos-slaves.sh </dev/null >/dev/null
> ./run spark.examples.SparkLR master@hp01:5050
12/03/28 00:34:01 INFO spark.BoundedMemoryCache: BoundedMemoryCache.maxBytes = 333572997
12/03/28 00:34:01 INFO spark.CacheTrackerActor: Registered actor on port 7077
12/03/28 00:34:01 INFO spark.MapOutputTrackerActor: Registered actor on port 7077
12/03/28 00:34:01 INFO spark.ShuffleManager: Shuffle dir: /tmp/spark-local-149ffc1c-e304-4625-bfd3-6c9530c8f0a4/shuffle
12/03/28 00:34:01 INFO server.Server: jetty-7.5.3.v20111011
12/03/28 00:34:01 INFO server.AbstractConnector: Started SelectChann...@0.0.0.0:47967 STARTING
12/03/28 00:34:01 INFO spark.ShuffleManager: Local URI: http://10.0.1.201:47967
java.lang.UnsatisfiedLinkError: no mesos in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1738)
    at java.lang.Runtime.loadLibrary0(Runtime.java:823)
    at java.lang.System.loadLibrary(System.java:1028)
    at spark.SparkContext.<init>(SparkContext.scala:75)
    at spark.examples.SparkLR$.main(SparkLR.scala:31)
    at spark.examples.SparkLR.main(SparkLR.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:78)
    at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:24)
    at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:88)
    at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:78)
    at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
    at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:33)
    at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:40)
    at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:56)
    at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:80)
    at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
    at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
12/03/28 01:35:05 INFO spark.BoundedMemoryCache: BoundedMemoryCache.maxBytes = 339585269
12/03/28 01:35:05 INFO spark.CacheTrackerActor: Registered actor on port 7077
12/03/28 01:35:05 INFO spark.ShuffleManager: Shuffle dir: /tmp/spark-local-8b8764e7-d6aa-417d-af4f-a63b2e9e39dc/shuffle
12/03/28 01:35:05 INFO spark.MapOutputTrackerActor: Registered actor on port 7077
12/03/28 01:35:05 INFO server.Server: jetty-7.5.3.v20111011
12/03/28 01:35:05 INFO server.AbstractConnector: Started SelectChann...@0.0.0.0:54892 STARTING
12/03/28 01:35:05 INFO spark.ShuffleManager: Local URI: http://10.0.1.201:54892
java.lang.NoClassDefFoundError: com/google/protobuf/ProtocolMessageEnum
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at sun.misc.Launcher$ExtClassLoader.findClass(Launcher.java:229)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:295)
    at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.scala$tools$nsc$util$ScalaClassLoader$$super$loadClass(ScalaClassLoader.scala:88)
    at scala.tools.nsc.util.ScalaClassLoader$class.loadClass(ScalaClassLoader.scala:50)
    at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.loadClass(ScalaClassLoader.scala:88)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    at spark.SparkContext.<init>(SparkContext.scala:76)
    at spark.examples.SparkLR$.main(SparkLR.scala:31)
    at spark.examples.SparkLR.main(SparkLR.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:78)
    at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:24)
    at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:88)
    at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:78)
    at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
    at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:33)
    at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:40)
    at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:56)
    at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:80)
    at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
    at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
Caused by: java.lang.ClassNotFoundException: com.google.protobuf.ProtocolMessageEnum
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at sun.misc.Launcher$ExtClassLoader.findClass(Launcher.java:229)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 34 more
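One hedged reading of this trace: the frames go through sun.misc.Launcher$ExtClassLoader, which suggests mesos.jar is being loaded by the extensions class loader (e.g. from jre/lib/ext), where Spark's protobuf jar is not visible. A sketch of a workaround under that assumption: keep mesos.jar out of the extensions directory and put it on the application classpath together with the protobuf jar it depends on (paths below are taken from this thread's layout and may differ on your setup):

```shell
# Sketch only -- both paths are assumptions based on this thread's directory layout.
# mesos.jar references com.google.protobuf classes, so the protobuf jar must be
# visible to the same class loader that loads mesos.jar:
CLASSPATH=/home/hadoop/SPARK/mesos/lib/java/mesos.jar
CLASSPATH=$CLASSPATH:/home/hadoop/SPARK/spark/lib_managed/jars/com.google.protobuf/protobuf-java/protobuf-java-2.3.0.jar
export CLASSPATH
```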
First, you may need to use an older SVN revision of Mesos to work with the latest Spark. The current version in SVN has some incompatible API changes that we haven't updated Spark to work with yet. Check out revision 1205738 of Mesos as follows:
svn checkout -r 1205738 http://svn.apache.org/repos/asf/incubator/mesos/trunk mesos
After that, you ought to be able to compile it, and it should run without any changes. Then, to have Spark find the Mesos libraries, create a file conf/spark-env.sh in your Spark directory (copy the spark-env.sh.template that's there) and add export MESOS_HOME=<path to Mesos> to it.
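Concretely, the file would look something like this (the MESOS_HOME value is only an example; point it at wherever you checked out and built Mesos):

```shell
# conf/spark-env.sh (start from conf/spark-env.sh.template)
# Example path only -- use your own Mesos checkout:
export MESOS_HOME=/home/hadoop/SPARK/mesos
```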
These instructions are up at https://github.com/mesos/spark/wiki/Running-spark-on-mesos. Hope this helps.
Matei
./run spark.examples.SparkLR master@hp01:5050
12/03/29 19:20:36 INFO spark.BoundedMemoryCache: BoundedMemoryCache.maxBytes = 339585269
12/03/29 19:20:36 INFO spark.CacheTrackerActor: Registered actor on port 7077
12/03/29 19:20:36 INFO spark.MapOutputTrackerActor: Registered actor on port 7077
12/03/29 19:20:36 INFO spark.ShuffleManager: Shuffle dir: /tmp/spark-local-9b7f01a7-3883-43e7-901d-1413946a2466/shuffle
12/03/29 19:20:36 INFO server.Server: jetty-7.5.3.v20111011
12/03/29 19:20:36 INFO server.AbstractConnector: Started SelectChann...@0.0.0.0:56184 STARTING
12/03/29 19:20:36 INFO spark.ShuffleManager: Local URI: http://10.0.1.201:56184
/home/hadoop/SPARK/mesos
/home/hadoop/SPARK/spark/core/target/scala-2.9.1/classes
/home/hadoop/SPARK/mesos/lib/java/mesos.jar
/home/hadoop/SPARK/spark/conf
/home/hadoop/SPARK/spark/repl/target/scala-2.9.1/classes
/home/hadoop/SPARK/spark/examples/target/scala-2.9.1/classes
/home/hadoop/SPARK/spark/core/lib/mesos.jar
/home/hadoop/SPARK/spark/lib_managed/jars/ant/ant/ant-1.6.5.jar
/home/hadoop/SPARK/spark/lib_managed/jars/com.googlecode/kryo/kryo-1.04.jar
/home/hadoop/SPARK/spark/lib_managed/jars/com.googlecode/minlog/minlog-1.2.jar
/home/hadoop/SPARK/spark/lib_managed/jars/com.googlecode/reflectasm/reflectasm-1.01.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.scala-tools.testing/scalacheck_2.9.1/scalacheck_2.9.1-1.9.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.scala-tools.testing/test-interface/test-interface-0.5.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.slf4j/slf4j-api/slf4j-api-1.6.1.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.slf4j/slf4j-log4j12/slf4j-log4j12-1.6.1.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.scalatest/scalatest_2.9.1/scalatest_2.9.1-1.6.1.jar
/home/hadoop/SPARK/spark/lib_managed/jars/it.unimi.dsi/fastutil/fastutil-6.4.2.jar
/home/hadoop/SPARK/spark/lib_managed/jars/com.google.protobuf/protobuf-java/protobuf-java-2.3.0.jar
/home/hadoop/SPARK/spark/lib_managed/jars/concurrent/concurrent/concurrent-1.3.4.jar
/home/hadoop/SPARK/spark/lib_managed/jars/net.java.dev.jets3t/jets3t/jets3t-0.7.1.jar
/home/hadoop/SPARK/spark/lib_managed/jars/net.sf.kosmosfs/kfs/kfs-0.3.jar
/home/hadoop/SPARK/spark/lib_managed/jars/commons-el/commons-el/commons-el-1.0.jar
/home/hadoop/SPARK/spark/lib_managed/jars/hsqldb/hsqldb/hsqldb-1.8.0.10.jar
/home/hadoop/SPARK/spark/lib_managed/jars/com.google.code.findbugs/jsr305/jsr305-1.3.9.jar
/home/hadoop/SPARK/spark/lib_managed/jars/javax.servlet/servlet-api/servlet-api-2.5.jar
/home/hadoop/SPARK/spark/lib_managed/jars/junit/junit/junit-4.5.jar
/home/hadoop/SPARK/spark/lib_managed/jars/xmlenc/xmlenc/xmlenc-0.52.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jdt/core/core-3.1.1.jar
/home/hadoop/SPARK/spark/lib_managed/jars/colt/colt/colt-1.2.0.jar
/home/hadoop/SPARK/spark/lib_managed/jars/de.javakaffee/kryo-serializers/kryo-serializers-0.9.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.apache.hadoop/hadoop-core/hadoop-core-0.20.2.jar
/home/hadoop/SPARK/spark/lib_managed/jars/tomcat/jasper-compiler/jasper-compiler-5.5.12.jar
/home/hadoop/SPARK/spark/lib_managed/jars/tomcat/jasper-runtime/jasper-runtime-5.5.12.jar
/home/hadoop/SPARK/spark/lib_managed/jars/com.google.guava/guava/guava-11.0.1.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-http/jetty-http-7.5.3.v20111011.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-server/jetty-server-7.5.3.v20111011.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-continuation/jetty-continuation-7.5.3.v20111011.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-util/jetty-util-7.5.3.v20111011.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-io/jetty-io-7.5.3.v20111011.jar
/home/hadoop/SPARK/spark/lib_managed/jars/asm/asm-all/asm-all-3.3.1.jar
/home/hadoop/SPARK/spark/lib_managed/jars/asm/asm/asm-3.2.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/servlet-api-2.5/servlet-api-2.5-6.1.14.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/jsp-api-2.1/jsp-api-2.1-6.1.14.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/jsp-2.1/jsp-2.1-6.1.14.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/jetty-util/jetty-util-6.1.14.jar
/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/jetty/jetty-6.1.14.jar
/home/hadoop/SPARK/spark/lib_managed/jars/oro/oro/oro-2.0.8.jar
/home/hadoop/SPARK/spark/lib_managed/jars/commons-net/commons-net/commons-net-1.4.1.jar
/home/hadoop/SPARK/spark/lib_managed/jars/commons-codec/commons-codec/commons-codec-1.3.jar
/home/hadoop/SPARK/spark/lib_managed/jars/commons-httpclient/commons-httpclient/commons-httpclient-3.0.1.jar
/home/hadoop/SPARK/spark/lib_managed/jars/commons-cli/commons-cli/commons-cli-1.2.jar
/home/hadoop/SPARK/spark/lib_managed/jars/commons-logging/commons-logging/commons-logging-1.1.1.jar
/home/hadoop/SPARK/spark/lib_managed/bundles/com.ning/compress-lzf/compress-lzf-0.8.4.jar
/home/hadoop/SPARK/spark/lib_managed/bundles/log4j/log4j/log4j-1.2.16.jar
/home/hadoop/SPARK/spark/lib_managed/bundles/org.jboss.netty/netty/netty-3.2.6.Final.jar
/home/hadoop/SPARK/spark/repl/lib/scala-jline.jar
/home/hadoop/SPARK/spark/bagel/target/scala-2.9.1/classes
And although stderr says "scala: not found", I can actually run scala on each machine. (Stderr on the other machines also says "scala: not found".)
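That pattern is typical of non-interactive ssh shells: the slave processes are launched over ssh without a login shell, so PATH additions from your interactive profile (e.g. .bash_profile) are never sourced, and scala is not found even though it runs fine at a prompt. One hedged workaround, assuming the early-Spark convention of setting SCALA_HOME in conf/spark-env.sh (the install path below is only an example):

```shell
# conf/spark-env.sh -- example path; point at the Scala 2.9.1 install on every node:
export SCALA_HOME=/usr/local/scala-2.9.1
export PATH=$SCALA_HOME/bin:$PATH
```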
On iteration 1
12/04/01 04:51:40 INFO spark.SparkContext: Starting job...
12/04/01 04:51:40 INFO spark.CacheTracker: Registering RDD ID 1 with cache
12/04/01 04:51:40 INFO spark.CacheTrackerActor: Registering RDD 1 with 2 partitions
12/04/01 04:51:40 INFO spark.CacheTracker: Registering RDD ID 0 with cache
12/04/01 04:51:40 INFO spark.CacheTrackerActor: Registering RDD 0 with 2 partitions
12/04/01 04:51:40 INFO spark.CacheTrackerActor: Asked for current cache locations
12/04/01 04:51:40 INFO spark.MesosScheduler: Final stage: Stage 0
12/04/01 04:51:40 INFO spark.MesosScheduler: Parents of final stage: List()
12/04/01 04:51:40 INFO spark.MesosScheduler: Missing parents: List()
12/04/01 04:51:40 INFO spark.MesosScheduler: Submitting Stage 0, which has no missing parents
12/04/01 04:51:40 INFO spark.MesosScheduler: Got a job with 2 tasks
12/04/01 04:51:40 INFO spark.MesosScheduler: Registered as framework ID 201203311232-0-0001
12/04/01 04:51:40 INFO spark.MesosScheduler: Adding job with ID 0
12/04/01 04:51:40 INFO spark.SimpleJob: Starting task 0:0 as TID 0 on slave 201203311232-0-1: pc02 (preferred)
12/04/01 04:51:40 INFO spark.SimpleJob: Starting task 0:1 as TID 1 on slave 201203311232-0-5: HP01 (preferred)
12/04/01 04:51:40 INFO spark.SimpleJob: Lost TID 1 (task 0:1)
12/04/01 04:51:41 INFO spark.SimpleJob: Finished TID 0 (progress: 1/2)
12/04/01 04:51:41 INFO spark.MesosScheduler: Completed ResultTask(0, 0)
12/04/01 04:51:41 INFO spark.SimpleJob: Starting task 0:1 as TID 2 on slave 201203311232-0-1: pc02 (preferred)
12/04/01 04:51:42 INFO spark.SimpleJob: Finished TID 2 (progress: 2/2)
12/04/01 04:51:42 INFO spark.MesosScheduler: Completed ResultTask(0, 1)
12/04/01 04:51:42 INFO spark.SparkContext: Job finished in 2.008355 s