13/06/14 10:34:45 WARN spark.Utils: Your hostname, eduZai resolves to a loopback address: 127.0.1.1; using 172.20.62.231 instead (on interface eth0)
13/06/14 10:34:45 WARN spark.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
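The two warnings above are harmless here, but if Spark keeps picking the wrong interface you can pin the bind address explicitly, as the log itself suggests. A minimal sketch (the address below is just the interface IP reported in the log, used as an example):

```shell
# Pin the address Spark binds to, instead of relying on hostname resolution.
# Set this in the environment before launching the driver (e.g. in spark-env.sh).
export SPARK_LOCAL_IP=172.20.62.231
```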
13/06/14 10:34:46 INFO slf4j.Slf4jEventHandler: Slf4jEventHandler started
13/06/14 10:34:46 INFO spark.SparkEnv: Registering BlockManagerMaster
13/06/14 10:34:46 INFO storage.MemoryStore: MemoryStore started with capacity 1161.6 MB.
13/06/14 10:34:46 INFO storage.DiskStore: Created local directory at /tmp/spark-local-20130614103446-fc9e
13/06/14 10:34:46 INFO network.ConnectionManager: Bound socket to port 39216 with id = ConnectionManagerId(eduZai,39216)
13/06/14 10:34:46 INFO storage.BlockManagerMaster: Trying to register BlockManager
13/06/14 10:34:46 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager eduZai:39216 with 1161.6 MB RAM
13/06/14 10:34:46 INFO storage.BlockManagerMaster: Registered BlockManager
13/06/14 10:34:46 INFO server.Server: jetty-7.6.8.v20121106
13/06/14 10:34:46 INFO spark.SparkEnv: Registering MapOutputTracker
13/06/14 10:34:46 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-83c3f75b-c12d-44e9-aeae-86ac37c95996
13/06/14 10:34:46 INFO server.Server: jetty-7.6.8.v20121106
13/06/14 10:34:46 INFO io.IoWorker: IoWorker thread 'spray-io-worker-0' started
13/06/14 10:34:47 INFO server.HttpServer: akka://spark/user/BlockManagerHTTPServer started on /0.0.0.0:51050
13/06/14 10:34:47 INFO storage.BlockManagerUI: Started BlockManager web UI at http://eduZai:51050
13/06/14 10:34:47 INFO spark.SparkContext: Starting job: foreach at FrameworkTestSpark.java:45
13/06/14 10:34:47 INFO scheduler.DAGScheduler: Got job 0 (foreach at FrameworkTestSpark.java:45) with 4 output partitions (allowLocal=false)
13/06/14 10:34:47 INFO scheduler.DAGScheduler: Final stage: Stage 0 (parallelize at FrameworkSchedulerForSpark.java:44)
13/06/14 10:34:47 INFO scheduler.DAGScheduler: Parents of final stage: List()
13/06/14 10:34:47 INFO scheduler.DAGScheduler: Missing parents: List()
13/06/14 10:34:47 INFO scheduler.DAGScheduler: Submitting Stage 0 (ParallelCollectionRDD[0] at parallelize at FrameworkTestSpark.java:44), which has no missing parents
13/06/14 10:34:47 INFO scheduler.DAGScheduler: Submitting 4 missing tasks from Stage 0 (ParallelCollectionRDD[0] at parallelize at FrameworkTestSpark.java:44)
13/06/14 10:34:47 INFO local.LocalScheduler: Running ResultTask(0, 2)
13/06/14 10:34:47 INFO local.LocalScheduler: Running ResultTask(0, 0)
13/06/14 10:34:47 INFO local.LocalScheduler: Running ResultTask(0, 1)
13/06/14 10:34:47 INFO local.LocalScheduler: Running ResultTask(0, 3)
13/06/14 10:34:47 ERROR local.LocalScheduler: Exception in task 0
java.io.NotSerializableException: FrameworkTestSpark
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1180)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1528)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1493)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1416)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1528)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1493)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1416)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1528)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1493)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1416)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1528)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1493)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1416)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:346)
at spark.JavaSerializationStream.writeObject(JavaSerializer.scala:11)
at spark.scheduler.ResultTask$.serializeInfo(ResultTask.scala:27)
at spark.scheduler.ResultTask.writeExternal(ResultTask.scala:91)
at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1443)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1414)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:346)
at spark.JavaSerializationStream.writeObject(JavaSerializer.scala:11)
at spark.JavaSerializerInstance.serialize(JavaSerializer.scala:31)
at spark.scheduler.Task$.serializeWithDependencies(Task.scala:61)
at spark.scheduler.local.LocalScheduler.runTask$1(LocalScheduler.scala:66)
at spark.scheduler.local.LocalScheduler$$anon$1.run(LocalScheduler.scala:49)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
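The key line is `java.io.NotSerializableException: FrameworkTestSpark`: when the scheduler serializes the task, Java serialization walks the object graph and hits the enclosing class itself, which is not `Serializable`. This typically happens when the function passed to `foreach` is an anonymous or local inner class, because such a class carries a hidden reference to its outer instance. The failure mode can be reproduced outside Spark entirely; the class names `Driver`, `Task`, and `SafeTask` below are hypothetical stand-ins, not anything from the original code:

```java
import java.io.ByteArrayOutputStream;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ClosureSerializationDemo {

    // Not Serializable -- plays the role of FrameworkTestSpark.
    static class Driver {
        Runnable makeTask() {
            // Local class in an instance method: it captures `Driver.this`
            // via a synthetic field, so serializing it drags Driver along.
            class Task implements Runnable, Serializable {
                public void run() { }
            }
            return new Task();
        }
    }

    // A static nested class holds no reference to an outer instance,
    // so it serializes cleanly. The analogous fix in the Spark job is
    // to put the foreach function in a (static) Serializable class, or
    // make the enclosing class implement Serializable.
    static class SafeTask implements Runnable, Serializable {
        public void run() { }
    }

    // Returns true iff `o` survives a round through ObjectOutputStream.
    static boolean serializes(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new Driver().makeTask())); // false
        System.out.println(serializes(new SafeTask()));          // true
    }
}
```

The first call fails for the same reason the Spark task does: the serializer reaches the non-serializable outer object through the captured reference, which is exactly the chain of `writeObject0`/`defaultWriteFields` frames in the trace above.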