Can't find mesos library when running standalone cluster...


paychex

Dec 6, 2012, 7:06:15 PM
to spark...@googlegroups.com
Our cluster is running in standalone mode, not under Mesos management; however, when we try to run the example "spark.examples.SparkPi 192.168.1.35[2]", it returns the error below.
After the error, we tried to compile mesos-0.9.0, but it always complains about JAVA_HOME. We are using JDK 1.7. Please help.

== error from running spark.examples.SparkPi
Failed to load native Mesos library from
Exception in thread "main" java.lang.UnsatisfiedLinkError: no mesos in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1856)
        at java.lang.Runtime.loadLibrary0(Runtime.java:845)
        at java.lang.System.loadLibrary(System.java:1084)
        at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:46)
        at spark.SparkContext.<init>(SparkContext.scala:174)
        at spark.SparkContext.<init>(SparkContext.scala:83)
        at spark.examples.SparkPi$.main(SparkPi.scala:13)
        at spark.examples.SparkPi.main(SparkPi.scala)

== error from compiling mesos
checking whether or not we can build using the JDK... ./libtool: line 1104: g++: command not found
configure: error: failed to build against JDK (using libtool)

Matei Zaharia

Dec 6, 2012, 7:09:50 PM
to spark...@googlegroups.com
The cluster URL you used is incorrect for a standalone cluster. It should look like spark://host:port. The master for the standalone mode prints the right URL (and shows it on its web UI).

Matei

Kalvin Lee

Dec 6, 2012, 7:33:42 PM
to spark...@googlegroups.com

I tried running it using spark://ip:port and it said the URL was invalid. Then I tried ip:port, and it gave the same error when running the SparkPi example.

Matei Zaharia

Dec 6, 2012, 8:21:49 PM
to spark...@googlegroups.com
Maybe it needs to be host:port. Use exactly what the master prints when you run spark.deploy.master.Master, or look at its web UI (localhost:8080) to see the URL there.
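For anyone following along, the standalone walkthrough at the time boiled down to something like the following. This is a sketch assuming the 0.6-era ./run launcher script; `<host>` is a placeholder for whatever URL your master actually prints:

```shell
# Start the standalone master; it logs a line like
#   "Starting Spark master at spark://<host>:7077"
./run spark.deploy.master.Master

# Start a worker, pointing it at the exact URL the master printed
./run spark.deploy.worker.Worker spark://<host>:7077

# Run the example against the same URL, keeping the spark:// prefix
./run spark.examples.SparkPi spark://<host>:7077
```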

Matei

paychex

Dec 6, 2012, 8:56:41 PM
to spark...@googlegroups.com
Matei,
Thank you for your quick response.
Let me try to clarify the issue.
The standalone master and slave are up and running.
The problem occurs when we try to run the example code, spark.examples.SparkPi.

Running spark.examples.SparkPi with local[2] works OK, but spark.examples.SparkPi 192.168.1.45[1] doesn't. The example runs fine locally, but when we try to run it against the cluster it fails with the errors above.

Matei Zaharia

Dec 6, 2012, 9:02:39 PM
to spark...@googlegroups.com
Yes, as I said, "192.168.1.45[1]" is not a valid cluster URL. It is trying to parse it as a Mesos URL, but even if you had the Mesos native library installed, that wouldn't work. You need to use the spark:// URL printed by the standalone mode master, including the "spark://" part at the beginning. Take a look at http://www.spark-project.org/docs/latest/spark-standalone.html and walk through it manually to see how to use the standalone deploy mode.
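To see why the bare IP falls through to Mesos: the master-string dispatch in SparkContext can be sketched roughly like this (a simplified shell sketch of the pattern matching, not Spark's actual Scala code). A string that matches neither the local nor the spark:// pattern ends up on the Mesos path, which is what triggers the native-library load:

```shell
# Rough sketch of how a 0.6-era SparkContext chose a scheduler
# from the master string (simplified; not the real implementation).
classify_master() {
  case "$1" in
    local|local\[*\]) echo "local" ;;       # in-process threads
    spark://*:*)      echo "standalone" ;;  # standalone deploy mode
    *)                echo "mesos" ;;       # fallback: tries to load libmesos
  esac
}

classify_master "local[2]"           # local
classify_master "spark://host:7077"  # standalone
classify_master "192.168.1.35[2]"    # mesos
```

So "192.168.1.35[2]" is treated as a Mesos master address, and the UnsatisfiedLinkError from MesosNativeLibrary.load in the stack trace follows directly from that.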

Matei

paychex

Dec 7, 2012, 12:44:06 AM
to spark...@googlegroups.com
Got it. Thank You.