No mesos in java.library.path when running spark job in cluster mode


Xianda Sun

Mar 27, 2012, 5:23:03 AM
to spark...@googlegroups.com
We are trying to set up Spark on a cluster of four machines. We can run the example job in local mode, but it fails when we try to run the example LR job in cluster mode.
We have installed Mesos; it runs, and through the web UI we can see information about the master and the other three machines:
Server: HP01 (10.0.1.201)
Built: 2012-03-22 21:43:55 by hadoop
Started: Tue Mar 27 2012 09:41:54 PM CST (2012-03-27 13:41:54 UTC)
ID: 201203272141-922681334-5050-4903
But we are not sure whether Mesos is installed correctly, because some configuration files seem to be missing from the version cloned from git (as well as the one checked out from svn). We modified the start-masters and start-slaves scripts, because the slave machines only have $MESOS_HOME and no ${prefix}/mesos-master file:
ssh ${MESOS_HOME}/deploy/mesos-daemon mesos-masters.sh </dev/null >/dev/null
ssh ${MESOS_HOME}/deploy/mesos-daemon mesos-slaves.sh </dev/null >/dev/null
And when we try to use Spark to run the example LR job in distributed mode:
> ./run spark.examples.SparkLR master@hp01:5050
it reports the following error:
12/03/28 00:34:01 INFO spark.BoundedMemoryCache: BoundedMemoryCache.maxBytes = 333572997
12/03/28 00:34:01 INFO spark.CacheTrackerActor: Registered actor on port 7077
12/03/28 00:34:01 INFO spark.MapOutputTrackerActor: Registered actor on port 7077
12/03/28 00:34:01 INFO spark.ShuffleManager: Shuffle dir: /tmp/spark-local-149ffc1c-e304-4625-bfd3-6c9530c8f0a4/shuffle
12/03/28 00:34:01 INFO server.Server: jetty-7.5.3.v20111011
12/03/28 00:34:01 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:47967 STARTING
12/03/28 00:34:01 INFO spark.ShuffleManager: Local URI: http://10.0.1.201:47967
java.lang.UnsatisfiedLinkError: no mesos in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1738)
        at java.lang.Runtime.loadLibrary0(Runtime.java:823)
        at java.lang.System.loadLibrary(System.java:1028)
        at spark.SparkContext.<init>(SparkContext.scala:75)
        at spark.examples.SparkLR$.main(SparkLR.scala:31)
        at spark.examples.SparkLR.main(SparkLR.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:78)
        at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:24)
        at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:88)
        at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:78)
        at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
        at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:33)
        at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:40)
        at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:56)
        at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:80)
        at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
        at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
We copied mesos.jar from $SPARK_HOME/core/lib/ to $MESOS_HOME/lib/java/, but the error remains.
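As far as we understand, the UnsatisfiedLinkError refers to the native library libmesos.so, which the JVM looks for on java.library.path, so copying the jar alone cannot fix it. The library path can be set with a JVM flag, for example (the directory below is only a guess at where a Mesos build puts libmesos.so):
 
java -Djava.library.path=$MESOS_HOME/lib/java ...   # "..." stands for the rest of the usual command line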
 
Could anyone tell us what is causing this error and how to fix it?
 
Best Regards

Xianda Sun

Mar 27, 2012, 5:57:14 AM
to spark...@googlegroups.com
I copied libmesos.so from $MESOS_HOME/src/.lib to $MESOS_HOME/lib/java and recompressed mesos.jar from $MESOS_HOME/src/java/classes, and now there seems to be an error about protobuf:
12/03/28 01:35:05 INFO spark.BoundedMemoryCache: BoundedMemoryCache.maxBytes = 339585269
12/03/28 01:35:05 INFO spark.CacheTrackerActor: Registered actor on port 7077
12/03/28 01:35:05 INFO spark.ShuffleManager: Shuffle dir: /tmp/spark-local-8b8764e7-d6aa-417d-af4f-a63b2e9e39dc/shuffle
12/03/28 01:35:05 INFO spark.MapOutputTrackerActor: Registered actor on port 7077
12/03/28 01:35:05 INFO server.Server: jetty-7.5.3.v20111011
12/03/28 01:35:05 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:54892 STARTING
12/03/28 01:35:05 INFO spark.ShuffleManager: Local URI: http://10.0.1.201:54892
java.lang.NoClassDefFoundError: com/google/protobuf/ProtocolMessageEnum
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at sun.misc.Launcher$ExtClassLoader.findClass(Launcher.java:229)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:295)
        at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.scala$tools$nsc$util$ScalaClassLoader$$super$loadClass(ScalaClassLoader.scala:88)
        at scala.tools.nsc.util.ScalaClassLoader$class.loadClass(ScalaClassLoader.scala:50)
        at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.loadClass(ScalaClassLoader.scala:88)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at spark.SparkContext.<init>(SparkContext.scala:76)
        at spark.examples.SparkLR$.main(SparkLR.scala:31)
        at spark.examples.SparkLR.main(SparkLR.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:78)
        at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:24)
        at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:88)
        at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:78)
        at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
        at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:33)
        at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:40)
        at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:56)
        at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:80)
        at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
        at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
Caused by: java.lang.ClassNotFoundException: com.google.protobuf.ProtocolMessageEnum
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at sun.misc.Launcher$ExtClassLoader.findClass(Launcher.java:229)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        ... 34 more
Best Regards 

Matei Zaharia

Mar 27, 2012, 5:13:09 PM
to spark...@googlegroups.com
Hi Xianda,

First, you may need to use an older SVN revision of Mesos to work with the latest Spark. The current version in SVN has some incompatible API changes that we haven't updated Spark to work with yet. Check out revision 1205738 of Mesos as follows:

svn checkout -r 1205738 http://svn.apache.org/repos/asf/incubator/mesos/trunk mesos

After that, you ought to be able to compile it and it will run without any changes. Then to have Spark find the libraries, you need to create a file conf/spark-env.sh in your Spark directory (copy spark-env.sh.template that's there), and in it add export MESOS_HOME=<path to Mesos>.
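For example, the whole file can be as small as this (the path is a placeholder for wherever you checked out Mesos):

# conf/spark-env.sh (copied from spark-env.sh.template)
export MESOS_HOME=/path/to/mesos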

These instructions are up at https://github.com/mesos/spark/wiki/Running-spark-on-mesos. Hope this helps.

Matei

Xianda Sun

Mar 28, 2012, 11:47:05 PM
to spark...@googlegroups.com
Thank you for the help, Matei, but I still have the same problem as described in the third post:
 
./run spark.examples.SparkLR master@hp01:5050
12/03/29 19:20:36 INFO spark.BoundedMemoryCache: BoundedMemoryCache.maxBytes = 339585269
12/03/29 19:20:36 INFO spark.CacheTrackerActor: Registered actor on port 7077
12/03/29 19:20:36 INFO spark.MapOutputTrackerActor: Registered actor on port 7077
12/03/29 19:20:36 INFO spark.ShuffleManager: Shuffle dir: /tmp/spark-local-9b7f01a7-3883-43e7-901d-1413946a2466/shuffle
12/03/29 19:20:36 INFO server.Server: jetty-7.5.3.v20111011
12/03/29 19:20:36 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:56184 STARTING
12/03/29 19:20:36 INFO spark.ShuffleManager: Local URI: http://10.0.1.201:56184
We are using Mesos at revision 1205738 and Spark at commit 90418b70ffc314d1f7b2461cc90aca682e276d9d.
 
And we are sure that protobuf-java-2.4.1.jar is in $SPARK_HOME/lib_managed/.../ and that the CLASSPATH in the 'run' script also includes this jar file.
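For reference, the protobuf jars that sbt pulled in can be listed with:

find $SPARK_HOME/lib_managed -name 'protobuf*'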
 
Thanks,

Matei Zaharia

Mar 28, 2012, 11:49:27 PM
to spark...@googlegroups.com
Actually, I think you should use protobuf 2.3.0 for that version of Mesos and Spark. Did you add 2.4.1 in there manually?

Matei

Xianda Sun

Mar 29, 2012, 12:57:08 AM
to spark...@googlegroups.com
Yes, I thought it might be because of the protobuf version, but the same error occurs when 2.3.0 is used.
 
Thanks

Matei Zaharia

Mar 29, 2012, 1:04:52 AM
to spark...@googlegroups.com
Looks like there was a bug that caused protobuf 2.4.1 to be downloaded anyway because a third-party library depended on it. I guess it just didn't get put ahead of 2.3.0 on my path when I tested it. I've fixed that now, so do a git pull and then sbt/sbt clean compile in Spark and it should work. Sorry about that!
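That is, something like:

cd spark              # your Spark checkout
git pull
sbt/sbt clean compile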

Matei

Xianda Sun

Mar 29, 2012, 6:31:23 AM
to spark...@googlegroups.com
Hi Matei,
 
Thanks for the update, but it seems that the class loader still cannot find the protobuf package.
 
Could it be that a 64-bit JVM does not support protobuf? Our machines run 64-bit CentOS (JDK 1.6.0_31). The error remains unchanged whether or not protobuf-2.3.0.jar is in $SPARK_HOME/lib_managed/jars/com.google.protobuf/protobuf-java.
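For reference, the JVM architecture can be confirmed with:

java -version   # a 64-bit HotSpot build prints "64-Bit Server VM"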
 
Thanks

Matei Zaharia

Mar 29, 2012, 11:12:10 AM
to spark...@googlegroups.com
No, there should be no issue with 64-bit. Almost everyone I know is using it. Try doing a clean checkout of both Spark and Mesos and rebuilding both of them. Maybe something is confused due to your earlier changes. You should not need to modify any code or place any JARs in there manually -- just download Mesos revision 1205738 and the master branch of Spark.

Matei

Matei Zaharia

Mar 29, 2012, 11:20:46 AM
to spark...@googlegroups.com
By the way, another thing you can do is edit the script called "run" in the Spark directory (./run) and at the bottom, just before it calls scala, add:

echo $CLASSPATH

This way you'll see where other JARs are coming from (e.g. is there a newer protobuf JAR on there), or other possible problems (e.g. spaces in the paths).
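For example, if the bottom of run looks something like this hypothetical sketch (the exact final line differs across Spark versions), add the echo just above the scala invocation:

echo $CLASSPATH                    # added for debugging
exec scala -cp "$CLASSPATH" "$@"   # hypothetical final line; yours may differ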

Matei

Xianda Sun

Mar 30, 2012, 11:20:10 PM
to spark...@googlegroups.com
Hi Matei,
 
Thank you for your help. We excluded the machine hp01, which had the "no protobuf" problem, and the system works now.
 
But we don't know why that machine is the exception, as the Mesos and Spark installations on all these machines are copies of the versions compiled on hp01.
 
Best Regards

Matei Zaharia

Mar 30, 2012, 11:22:35 PM
to spark...@googlegroups.com
Great. To debug that, try adding that echo $CLASSPATH line to the <spark_home>/run script on that machine, and let me know what it prints. You'll see its output in the "stdout" file created in the Mesos work directory for that machine (usually something like <mesos_home>/work/slave-<id>/framework-<id>/0/stdout).
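If you're not sure which directory that is, something like this should turn up the files:

find <mesos_home>/work -name stdout -o -name stderr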

Matei

Xianda Sun

Mar 30, 2012, 11:45:16 PM
to spark...@googlegroups.com
OK, I found an stderr and an stdout file in hp01's 0 directory; the files are attached.
 
Although stderr says "scala: not found", I can actually run scala on each machine. (The stderr files on the other machines also say "scala: not found".)
 
And when I try to submit the job on hp01 using the run script, it prints:
 
/home/hadoop/SPARK/mesos:/home/hadoop/SPARK/spark/core/target/scala-2.9.1/classes:/home/hadoop/SPARK/mesos/lib/java/mesos.jar:/home/hadoop/SPARK/spark/conf:/home/hadoop/SPARK/spark/repl/target/scala-2.9.1/classes:/home/hadoop/SPARK/spark/examples/target/scala-2.9.1/classes:/home/hadoop/SPARK/spark/core/lib/mesos.jar:/home/hadoop/SPARK/spark/lib_managed/jars/ant/ant/ant-1.6.5.jar:/home/hadoop/SPARK/spark/lib_managed/jars/com.googlecode/kryo/kryo-1.04.jar:/home/hadoop/SPARK/spark/lib_managed/jars/com.googlecode/minlog/minlog-1.2.jar:/home/hadoop/SPARK/spark/lib_managed/jars/com.googlecode/reflectasm/reflectasm-1.01.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.scala-tools.testing/scalacheck_2.9.1/scalacheck_2.9.1-1.9.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.scala-tools.testing/test-interface/test-interface-0.5.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.slf4j/slf4j-api/slf4j-api-1.6.1.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.slf4j/slf4j-log4j12/slf4j-log4j12-1.6.1.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.scalatest/scalatest_2.9.1/scalatest_2.9.1-1.6.1.jar:/home/hadoop/SPARK/spark/lib_managed/jars/it.unimi.dsi/fastutil/fastutil-6.4.2.jar:/home/hadoop/SPARK/spark/lib_managed/jars/com.google.protobuf/protobuf-java/protobuf-java-2.3.0.jar:/home/hadoop/SPARK/spark/lib_managed/jars/concurrent/concurrent/concurrent-1.3.4.jar:/home/hadoop/SPARK/spark/lib_managed/jars/net.java.dev.jets3t/jets3t/jets3t-0.7.1.jar:/home/hadoop/SPARK/spark/lib_managed/jars/net.sf.kosmosfs/kfs/kfs-0.3.jar:/home/hadoop/SPARK/spark/lib_managed/jars/commons-el/commons-el/commons-el-1.0.jar:/home/hadoop/SPARK/spark/lib_managed/jars/hsqldb/hsqldb/hsqldb-1.8.0.10.jar:/home/hadoop/SPARK/spark/lib_managed/jars/com.google.code.findbugs/jsr305/jsr305-1.3.9.jar:/home/hadoop/SPARK/spark/lib_managed/jars/javax.servlet/servlet-api/servlet-api-2.5.jar:/home/hadoop/SPARK/spark/lib_managed/jars/junit/junit/junit-4.5.jar:/home/hadoop/SPARK/spark/lib_managed/jars/xmlenc/xmlenc/xmlenc-0.52.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jdt/core/core-3.1.1.jar:/home/hadoop/SPARK/spark/lib_managed/jars/colt/colt/colt-1.2.0.jar:/home/hadoop/SPARK/spark/lib_managed/jars/de.javakaffee/kryo-serializers/kryo-serializers-0.9.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.apache.hadoop/hadoop-core/hadoop-core-0.20.2.jar:/home/hadoop/SPARK/spark/lib_managed/jars/tomcat/jasper-compiler/jasper-compiler-5.5.12.jar:/home/hadoop/SPARK/spark/lib_managed/jars/tomcat/jasper-runtime/jasper-runtime-5.5.12.jar:/home/hadoop/SPARK/spark/lib_managed/jars/com.google.guava/guava/guava-11.0.1.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-http/jetty-http-7.5.3.v20111011.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-server/jetty-server-7.5.3.v20111011.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-continuation/jetty-continuation-7.5.3.v20111011.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-util/jetty-util-7.5.3.v20111011.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.eclipse.jetty/jetty-io/jetty-io-7.5.3.v20111011.jar:/home/hadoop/SPARK/spark/lib_managed/jars/asm/asm-all/asm-all-3.3.1.jar:/home/hadoop/SPARK/spark/lib_managed/jars/asm/asm/asm-3.2.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/servlet-api-2.5/servlet-api-2.5-6.1.14.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/jsp-api-2.1/jsp-api-2.1-6.1.14.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/jsp-2.1/jsp-2.1-6.1.14
.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/jetty-util/jetty-util-6.1.14.jar:/home/hadoop/SPARK/spark/lib_managed/jars/org.mortbay.jetty/jetty/jetty-6.1.14.jar:/home/hadoop/SPARK/spark/lib_managed/jars/oro/oro/oro-2.0.8.jar:/home/hadoop/SPARK/spark/lib_managed/jars/commons-net/commons-net/commons-net-1.4.1.jar:/home/hadoop/SPARK/spark/lib_managed/jars/commons-codec/commons-codec/commons-codec-1.3.jar:/home/hadoop/SPARK/spark/lib_managed/jars/commons-httpclient/commons-httpclient/commons-httpclient-3.0.1.jar:/home/hadoop/SPARK/spark/lib_managed/jars/commons-cli/commons-cli/commons-cli-1.2.jar:/home/hadoop/SPARK/spark/lib_managed/jars/commons-logging/commons-logging/commons-logging-1.1.1.jar:/home/hadoop/SPARK/spark/lib_managed/bundles/com.ning/compress-lzf/compress-lzf-0.8.4.jar:/home/hadoop/SPARK/spark/lib_managed/bundles/log4j/log4j/log4j-1.2.16.jar:/home/hadoop/SPARK/spark/lib_managed/bundles/org.jboss.netty/netty/netty-3.2.6.Final.jar:/home/hadoop/SPARK/spark/repl/lib/scala-jline.jar:/home/hadoop/SPARK/spark/bagel/target/scala-2.9.1/classes
It seems that the protobuf jar has been found on the classpath.
 
Best Regards
Attachments: stderr, stdout

Matei Zaharia

Mar 30, 2012, 11:51:24 PM
to spark...@googlegroups.com
> Although stderr says "scala: not found", I can actually run scala on each machine. (The stderr files on the other machines also say "scala: not found".)

This means that Scala isn't on your PATH by default in non-login shells. You need to set the SCALA_HOME environment variable in conf/spark-env.sh to tell Spark where Scala is installed. I'm surprised that it could work over Mesos at all if this happens... maybe you had it set before?
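For example (the path below is just a placeholder; point it at your own Scala installation):

# conf/spark-env.sh
export SCALA_HOME=/usr/local/scala-2.9.1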

Anyway, I'm kind of stumped on the protobuf issue, but I'll think more about it. My guess is still that one machine has older code than the others, or a few extra JARs left over.

Matei

Xianda Sun

Mar 31, 2012, 12:39:15 AM
to spark...@googlegroups.com
The Scala error no longer occurs now that SCALA_HOME is added to spark-env.sh. I guess the reason it worked before is that the mesos-master/slave processes were started by the user and thus had access to SCALA_HOME, but I am not sure.
 
And it seems that hp01 (the one with the protobuf error) is kicked out after running a Spark job (spark.examples.SparkLR):
On iteration 1
12/04/01 04:51:40 INFO spark.SparkContext: Starting job...
12/04/01 04:51:40 INFO spark.CacheTracker: Registering RDD ID 1 with cache
12/04/01 04:51:40 INFO spark.CacheTrackerActor: Registering RDD 1 with 2 partitions
12/04/01 04:51:40 INFO spark.CacheTracker: Registering RDD ID 0 with cache
12/04/01 04:51:40 INFO spark.CacheTrackerActor: Registering RDD 0 with 2 partitions
12/04/01 04:51:40 INFO spark.CacheTrackerActor: Asked for current cache locations
12/04/01 04:51:40 INFO spark.MesosScheduler: Final stage: Stage 0
12/04/01 04:51:40 INFO spark.MesosScheduler: Parents of final stage: List()
12/04/01 04:51:40 INFO spark.MesosScheduler: Missing parents: List()
12/04/01 04:51:40 INFO spark.MesosScheduler: Submitting Stage 0, which has no missing parents
12/04/01 04:51:40 INFO spark.MesosScheduler: Got a job with 2 tasks
12/04/01 04:51:40 INFO spark.MesosScheduler: Registered as framework ID 201203311232-0-0001
12/04/01 04:51:40 INFO spark.MesosScheduler: Adding job with ID 0
12/04/01 04:51:40 INFO spark.SimpleJob: Starting task 0:0 as TID 0 on slave 201203311232-0-1: pc02 (preferred)
12/04/01 04:51:40 INFO spark.SimpleJob: Starting task 0:1 as TID 1 on slave 201203311232-0-5: HP01 (preferred)
12/04/01 04:51:40 INFO spark.SimpleJob: Lost TID 1 (task 0:1)
12/04/01 04:51:41 INFO spark.SimpleJob: Finished TID 0 (progress: 1/2)
12/04/01 04:51:41 INFO spark.MesosScheduler: Completed ResultTask(0, 0)
12/04/01 04:51:41 INFO spark.SimpleJob: Starting task 0:1 as TID 2 on slave 201203311232-0-1: pc02 (preferred)
12/04/01 04:51:42 INFO spark.SimpleJob: Finished TID 2 (progress: 2/2)
12/04/01 04:51:42 INFO spark.MesosScheduler: Completed ResultTask(0, 1)
12/04/01 04:51:42 INFO spark.SparkContext: Job finished in 2.008355 s
And when I run "lsof -i" on hp01 after the job, no process named mesos-slave is found.
 
Thank you for your help!

Xianda Sun

Mar 31, 2012, 12:42:51 AM
to spark...@googlegroups.com
Okay: the machines that did not report the missing SCALA_HOME worked, and the ones that did report it didn't do any work.