spark-shell hangs


James Donahue

Nov 6, 2012, 12:50:48 PM
to spark...@googlegroups.com

I launched a Master and a Worker and can see through the web UI that both are alive.

Then, I go to a different machine and try:

    MASTER=<spark host> ./spark-shell

and get the following:

[ec2-user@ip-10-72-131-182 spark]$ MASTER=10.72.249.233:7077 ./spark-shell
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 0.7.0
      /_/

Using Scala version 2.9.2 (OpenJDK 64-Bit Server VM, Java 1.6.0_24)
Initializing interpreter...
12/11/06 17:37:56 INFO server.Server: jetty-7.5.3.v20111011
12/11/06 17:37:56 INFO server.AbstractConnector: Started SelectChann...@0.0.0.0:45643 STARTING
Creating SparkContext...
12/11/06 17:38:06 INFO slf4j.Slf4jEventHandler: Slf4jEventHandler started
12/11/06 17:38:06 INFO storage.BlockManagerMaster: Registered BlockManagerMaster Actor
12/11/06 17:38:06 INFO storage.MemoryStore: MemoryStore started with capacity 323.9 MB.
12/11/06 17:38:06 INFO storage.DiskStore: Created local directory at /tmp/spark-local-20121106173806-a6bc
12/11/06 17:38:06 INFO network.ConnectionManager: Bound socket to port 58706 with id = ConnectionManagerId(ip-10-72-131-182,58706)
12/11/06 17:38:06 INFO storage.BlockManagerMaster: Trying to register BlockManager
12/11/06 17:38:06 INFO storage.BlockManagerMasterActor: Got Register Msg from master node, don't register it
12/11/06 17:38:06 INFO storage.BlockManagerMaster: BlockManager registered successfully @ syncRegisterBlockManager
12/11/06 17:38:06 INFO storage.BlockManagerMaster: Done registering BlockManager
12/11/06 17:38:06 INFO server.Server: jetty-7.5.3.v20111011
12/11/06 17:38:06 INFO server.AbstractConnector: Started SelectChann...@0.0.0.0:33844 STARTING
12/11/06 17:38:06 INFO broadcast.HttpBroadcast: Broadcast server started at http://10.72.131.182:33844
12/11/06 17:38:06 INFO spark.CacheTracker: Registered CacheTrackerActor actor
12/11/06 17:38:06 INFO spark.MapOutputTracker: Registered MapOutputTrackerActor actor
12/11/06 17:38:06 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-934cbae5-b294-4be1-be9a-44a8022f4fed
12/11/06 17:38:06 INFO server.Server: jetty-7.5.3.v20111011
12/11/06 17:38:06 INFO server.AbstractConnector: Started SelectChann...@0.0.0.0:52376 STARTING

At this point the shell hangs. When I hit CTRL-C, I get the following:

java.lang.InterruptedException
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Object.java:502)
        at spark.scheduler.mesos.MesosSchedulerBackend.waitForRegister(MesosSchedulerBackend.scala:142)
        at spark.scheduler.mesos.MesosSchedulerBackend.start(MesosSchedulerBackend.scala:74)
        at spark.scheduler.cluster.ClusterScheduler.start(ClusterScheduler.scala:68)
        at spark.SparkContext.<init>(SparkContext.scala:186)
        at spark.SparkContext.<init>(SparkContext.scala:82)
        at spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:841)
        at <init>(<console>:10)
        at <init>(<console>:22)
        at <init>(<console>:24)
        at .<init>(<console>:28)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $export(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:629)
        at spark.repl.SparkIMain$Request$$anonfun$10.apply(SparkIMain.scala:890)
        at scala.tools.nsc.interpreter.Line$$anonfun$1.apply$mcV$sp(Line.scala:43)
        at scala.tools.nsc.io.package$$anon$2.run(package.scala:25)
        at java.lang.Thread.run(Thread.java:679)
Type in expressions to have them evaluated.
Type :help for more information.

scala> exit

I don't see anything in the work directory on the master...

Jim Donahue

Adobe Systems

Mark Hamstra

Nov 6, 2012, 2:25:37 PM
to spark...@googlegroups.com
Try...

[ec2-user@ip-10-72-131-182 spark]$ MASTER=spark://10.72.249.233:7077 ./spark-shell
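
The spark:// prefix is what selects Spark's standalone deploy mode. Judging from the MesosSchedulerBackend frames in the stack trace above, a bare host:port is treated as a Mesos master URL, so the shell blocks in waitForRegister waiting for a Mesos registration that never completes. The same spark:// URL works when creating a SparkContext programmatically; here is a minimal sketch against the 0.7 API (the job name and the count job are just illustrative):

    import spark.SparkContext

    // Point the context at the standalone master via the spark:// URL scheme.
    // "ShellTest" is an arbitrary job name used only for this example.
    val sc = new SparkContext("spark://10.72.249.233:7077", "ShellTest")

    // Simple sanity check that jobs actually run on the cluster.
    println(sc.parallelize(1 to 100).count())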