spark-executor in a cluster without Mesos


Giovanni Comarela

May 14, 2013, 7:19:20 PM
to spark...@googlegroups.com
Hi,

probably a simple question.

I have spark deployed in a cluster without Mesos.

I can use spark-shell with no problems, reading files from HDFS.

However, when I try to use spark-executor, I get the following error:

Failed to load native Mesos library from
Exception in thread "main" java.lang.UnsatisfiedLinkError: no mesos in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1860)
    at java.lang.Runtime.loadLibrary0(Runtime.java:845)
    at java.lang.System.loadLibrary(System.java:1084)
    at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:46)
    at spark.executor.MesosExecutorBackend$.main(MesosExecutorBackend.scala:69)
    at spark.executor.MesosExecutorBackend.main(MesosExecutorBackend.scala)

This is expected, since I am not using Mesos. My question is: can I run spark-executor in a cluster without Mesos? If so, how should I modify the script?
My spark-executor looks like this:

#!/bin/sh
FWDIR="`dirname $0`"
echo "Running spark-executor with framework dir = $FWDIR"
exec $FWDIR/run spark.executor.MesosExecutorBackend

Thanks,

Giovanni Comarela

Josh Rosen

May 14, 2013, 7:28:35 PM
to spark...@googlegroups.com
Why are you trying to run `spark-executor`?  That script isn't meant to be used without Mesos.

If you want to run Spark on a cluster without using Mesos or YARN, you should use Spark's Standalone Mode (this was introduced in Spark 0.7): http://spark-project.org/docs/latest/spark-standalone.html
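Roughly, standalone mode looks like this when launched by hand, in the same style as your script (the master hostname and the default port 7077 below are placeholders; the exact commands and cluster launch scripts are in the docs linked above):

# On the master node; it logs a URL of the form spark://HOST:PORT
./run spark.deploy.master.Master

# On each worker node, registering against the master's URL
./run spark.deploy.worker.Worker spark://master-host:7077

# From any node, point spark-shell at the standalone master
MASTER=spark://master-host:7077 ./spark-shell

The workers connect to the master themselves, so there is no spark-executor script to run at all in this mode.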

Giovanni

May 14, 2013, 7:57:29 PM
to spark...@googlegroups.com
Thanks,

Spark beginner here. I knew it was simple!

Regards,

Giovanni Comarela

