SparkNet on local cluster


MªLuz Morales

Feb 24, 2016, 5:35:24 AM
to sparknet-users
Hi,
I'm trying to replicate the CIFAR-10 example from GitHub on my local cluster. I get this error:

Warning: Local jar /home/ubuntu/SparkNet-master/target/scala-2.10/sparknet-assembly-0.1-SNAPSHOT.jar does not exist, skipping.

I downloaded the SparkNet-master.zip available on GitHub. But SparkNet-master did not contain a target directory, and therefore did not contain the .jar file.

How can I make this work?

Thanks
Regards

Robert Nishihara

Feb 24, 2016, 1:20:46 PM
to MªLuz Morales, sparknet-users
Looks like the project hasn't been compiled. Try

    cd SparkNet/
    sbt assembly

That should build the project. You will need sbt and some other things installed. There are some instructions for that in the README https://github.com/amplab/SparkNet/blob/master/README.md in the section "Building your own AMI". Let me know if that works or if you run into other problems.
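
For reference, here is roughly what that flow looks like end to end. This is only a sketch: the jar path is the one from the warning in your first message, and the scala-2.10 part of the path may differ if your build uses a different Scala version.

```shell
# Build the SparkNet fat jar and check that it landed where spark-submit expects it.
cd SparkNet/ 2>/dev/null || echo "run this from the directory that contains SparkNet/"

if command -v sbt >/dev/null 2>&1; then
    sbt assembly
else
    echo "sbt not found on PATH; install it first (see the README)"
fi

# Path from the 'Local jar ... does not exist' warning:
JAR=target/scala-2.10/sparknet-assembly-0.1-SNAPSHOT.jar
if [ -f "$JAR" ]; then
    echo "built: $JAR"
else
    echo "missing: $JAR (the assembly step did not run or failed)"
fi
```

Once the jar exists at that path, the original spark-submit command should stop skipping it.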


Robert Nishihara

Mar 2, 2016, 3:21:35 AM
to MªLuz Morales, sparknet-users
+sparknet-users so we have a record

On Mon, Feb 29, 2016 at 11:50 PM MªLuz Morales <mlz...@gmail.com> wrote:
Hi,
I'm using Ubuntu Server 14.04.4 LTS (64-bit) on a 4-node cluster running in a virtual machine on Windows 8.1 (PC: Intel Core i7-4720HQ CPU @ 2.60 GHz; 16 GB RAM, 64-bit OS).

This is the full output I get when I run sbt assembly:
[info] Loading project definition from /home/ubuntu/SparkNet/project
[info] Set current project to sparknet (in build file:/home/ubuntu/SparkNet/)
[warn] Multiple main classes detected.  Run 'show discoveredMainClasses' to see the list
[info] Including from cache: javacpp-1.2-SNAPSHOT.jar
[info] Including from cache: aws-java-sdk-emr-1.10.21.jar
[info] Including from cache: aws-java-sdk-elasticache-1.10.21.jar
[info] Including from cache: aws-java-sdk-elastictranscoder-1.10.21.jar
[info] Including from cache: aws-java-sdk-ec2-1.10.21.jar
[info] Including from cache: caffe-master-1.2-SNAPSHOT-linux-x86_64.jar
[info] Including from cache: caffe-master-1.2-SNAPSHOT.jar
[info] Including from cache: aws-java-sdk-dynamodb-1.10.21.jar
[info] Including from cache: opencv-3.1.0-1.2-SNAPSHOT.jar
[info] Including from cache: aws-java-sdk-cloudtrail-1.10.21.jar
[info] Including from cache: aws-java-sdk-cloudwatch-1.10.21.jar
[info] Including from cache: aws-java-sdk-logs-1.10.21.jar
[info] Including from cache: aws-java-sdk-cognitoidentity-1.10.21.jar
[info] Including from cache: aws-java-sdk-cognitosync-1.10.21.jar
[info] Including from cache: aws-java-sdk-directconnect-1.10.21.jar
[info] Including from cache: aws-java-sdk-cloudformation-1.10.21.jar
[info] Including from cache: aws-java-sdk-cloudfront-1.10.21.jar
[info] Including from cache: opencv-3.1.0-1.2-SNAPSHOT-linux-x86_64.jar
[info] Including from cache: aws-java-sdk-kinesis-1.10.21.jar
[info] Including from cache: aws-java-sdk-opsworks-1.10.21.jar
[info] Including from cache: protobuf-java-2.5.0.jar
[info] Including from cache: spark-csv_2.11-1.3.0.jar
[info] Including from cache: aws-java-sdk-ses-1.10.21.jar
[info] Including from cache: aws-java-sdk-autoscaling-1.10.21.jar
[info] Including from cache: aws-java-sdk-cloudsearch-1.10.21.jar
[info] Including from cache: aws-java-sdk-cloudwatchmetrics-1.10.21.jar
[info] Including from cache: aws-java-sdk-swf-libraries-1.10.21.jar
[info] Including from cache: aws-java-sdk-codedeploy-1.10.21.jar
[info] Including from cache: aws-java-sdk-codepipeline-1.10.21.jar
[info] Including from cache: aws-java-sdk-config-1.10.21.jar
[info] Including from cache: aws-java-sdk-lambda-1.10.21.jar
[info] Including from cache: aws-java-sdk-ecs-1.10.21.jar
[info] Including from cache: aws-java-sdk-cloudhsm-1.10.21.jar
[info] Including from cache: aws-java-sdk-ssm-1.10.21.jar
[info] Including from cache: aws-java-sdk-workspaces-1.10.21.jar
[info] Including from cache: scala-library-2.11.7.jar
[info] Including from cache: commons-csv-1.1.jar
[info] Including from cache: aws-java-sdk-machinelearning-1.10.21.jar
[info] Including from cache: aws-java-sdk-directory-1.10.21.jar
[info] Including from cache: aws-java-sdk-efs-1.10.21.jar
[info] Including from cache: univocity-parsers-1.5.1.jar
[info] Including from cache: aws-java-sdk-codecommit-1.10.21.jar
[info] Including from cache: jna-4.2.1.jar
[info] Including from cache: aws-java-sdk-1.10.21.jar
[info] Including from cache: aws-java-sdk-devicefarm-1.10.21.jar
[info] Including from cache: thumbnailator-0.4.2.jar
[info] Including from cache: common-lang-3.1.2.jar
[info] Including from cache: common-io-3.1.2.jar
[info] Including from cache: common-image-3.1.2.jar
[info] Including from cache: aws-java-sdk-support-1.10.21.jar
[info] Including from cache: imageio-jpeg-3.1.2.jar
[info] Including from cache: aws-java-sdk-core-1.10.21.jar
[info] Including from cache: commons-logging-1.1.3.jar
[info] Including from cache: imageio-metadata-3.1.2.jar
[info] Including from cache: imageio-core-3.1.2.jar
[info] Including from cache: jackson-annotations-2.4.0.jar
[info] Including from cache: httpclient-4.3.6.jar
[info] Including from cache: httpcore-4.3.3.jar
[info] Including from cache: commons-codec-1.6.jar
[info] Including from cache: jackson-core-2.4.4.jar
[info] Including from cache: joda-time-2.8.1.jar
[info] Including from cache: jackson-databind-2.4.4.jar
[info] Including from cache: aws-java-sdk-simpledb-1.10.21.jar
[info] Including from cache: aws-java-sdk-sts-1.10.21.jar
[info] Including from cache: aws-java-sdk-sqs-1.10.21.jar
[info] Including from cache: aws-java-sdk-rds-1.10.21.jar
[info] Including from cache: aws-java-sdk-simpleworkflow-1.10.21.jar
[info] Including from cache: aws-java-sdk-storagegateway-1.10.21.jar
[info] Including from cache: aws-java-sdk-redshift-1.10.21.jar
[info] Including from cache: aws-java-sdk-elasticbeanstalk-1.10.21.jar
[info] Including from cache: aws-java-sdk-route53-1.10.21.jar
[info] Including from cache: aws-java-sdk-glacier-1.10.21.jar
[info] Including from cache: aws-java-sdk-s3-1.10.21.jar
[info] Including from cache: aws-java-sdk-kms-1.10.21.jar
[info] Including from cache: aws-java-sdk-sns-1.10.21.jar
[info] Including from cache: aws-java-sdk-importexport-1.10.21.jar
[info] Including from cache: aws-java-sdk-iam-1.10.21.jar
[info] WeightCollectionSpec:
[info] Including from cache: aws-java-sdk-datapipeline-1.10.21.jar
[info] Including from cache: aws-java-sdk-elasticloadbalancing-1.10.21.jar
flatCopy() took 0.123s
flatCopyFast() took 0.001s
[info] NDArraySpec:
[info] CaffeNetSpec:
[info] NetParam

java.lang.UnsatisfiedLinkError: no jniopencv_core in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
        at java.lang.Runtime.loadLibrary0(Runtime.java:849)
        at java.lang.System.loadLibrary(System.java:1088)
        at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:629)
        at org.bytedeco.javacpp.Loader.load(Loader.java:467)
        at org.bytedeco.javacpp.Loader.load(Loader.java:404)
        at org.bytedeco.javacpp.opencv_core.<clinit>(opencv_core.java:10)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:278)
        at org.bytedeco.javacpp.Loader.load(Loader.java:439)
        at org.bytedeco.javacpp.Loader.load(Loader.java:404)
        at org.bytedeco.javacpp.caffe$NetParameter.<clinit>(caffe.java:1940)
        at CaffeNetSpec$$anonfun$1.apply$mcV$sp(CaffeNetSpec.scala:13)
        at CaffeNetSpec$$anonfun$1.apply(CaffeNetSpec.scala:12)
        at CaffeNetSpec$$anonfun$1.apply(CaffeNetSpec.scala:12)
        at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
        at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
        at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
        at org.scalatest.Transformer.apply(Transformer.scala:22)
        at org.scalatest.Transformer.apply(Transformer.scala:20)
        at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1636)
        at org.scalatest.Suite$class.withFixture(Suite.scala:1121)
        at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
        at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1633)
        at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1645)
        at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1645)
        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
        at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1645)
        at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
        at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1703)
        at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1703)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
        at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
        at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
        at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1703)
        at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
        at org.scalatest.Suite$class.run(Suite.scala:1423)
        at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
        at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1749)
        at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1749)
        at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
        at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1749)
        at org.scalatest.FlatSpec.run(FlatSpec.scala:1683)
        at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:444)
        at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:651)

        at sbt.TestRunner.runTest$1(TestFramework.scala:76)
        at sbt.TestRunner.run(TestFramework.scala:85)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
        at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:185)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
        at sbt.TestFunction.apply(TestFramework.scala:207)
        at sbt.Tests$.sbt$Tests$$processRunnable$1(Tests.scala:239)
        at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
        at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
        at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
        at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:237)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.UnsatisfiedLinkError: /tmp/javacpp497121955371/libjniopencv_core.so: /usr/lib/x86_64-linux-gnu/libgomp.so.1: version `GOMP_4.0' not found (required by /tmp/javacpp497121955371/libopencv_core.so.3.1)
        at java.lang.ClassLoader$NativeLibrary.load(Native Method)
        at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1965)
        at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1890)
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1851)
        at java.lang.Runtime.load0(Runtime.java:795)
        at java.lang.System.load(System.java:1062)
        at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:612)
        at org.bytedeco.javacpp.Loader.load(Loader.java:467)
        at org.bytedeco.javacpp.Loader.load(Loader.java:404)
        at org.bytedeco.javacpp.opencv_core.<clinit>(opencv_core.java:10)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:278)
        at org.bytedeco.javacpp.Loader.load(Loader.java:439)
        at org.bytedeco.javacpp.Loader.load(Loader.java:404)
        at org.bytedeco.javacpp.caffe$NetParameter.<clinit>(caffe.java:1940)
        at CaffeNetSpec$$anonfun$1.apply$mcV$sp(CaffeNetSpec.scala:13)
        at CaffeNetSpec$$anonfun$1.apply(CaffeNetSpec.scala:12)
        at CaffeNetSpec$$anonfun$1.apply(CaffeNetSpec.scala:12)
        at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
        at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
        at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
        at org.scalatest.Transformer.apply(Transformer.scala:22)
        at org.scalatest.Transformer.apply(Transformer.scala:20)
        at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1636)
        at org.scalatest.Suite$class.withFixture(Suite.scala:1121)
        at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
        at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1633)
        at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1645)
        at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1645)
        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
        at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1645)
        at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
        at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1703)
        at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1703)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
        at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
        at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
        at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1703)
        at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
        at org.scalatest.Suite$class.run(Suite.scala:1423)
        at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
        at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1749)
        at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1749)
        at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
        at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1749)
        at org.scalatest.FlatSpec.run(FlatSpec.scala:1683)
        at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:444)
        at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:651)

        at sbt.TestRunner.runTest$1(TestFramework.scala:76)
        at sbt.TestRunner.run(TestFramework.scala:85)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
        at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:185)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
        at sbt.TestFunction.apply(TestFramework.scala:207)
        at sbt.Tests$.sbt$Tests$$processRunnable$1(Tests.scala:239)
        at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
        at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
        at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
        at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:237)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
[error] Could not run test CaffeNetSpec: java.lang.UnsatisfiedLinkError: no jniopencv_core in java.library.path
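
The actual failure is in the "Caused by" line above: the bundled OpenCV native library wants the GOMP_4.0 symbol version from libgomp, which (as far as I know) first ships with the GCC 4.9 runtime, while Ubuntu 14.04 defaults to GCC 4.8. A quick way to check what your system libgomp provides, as a sketch: the path is taken from the error message, and it assumes the binutils `strings` tool is installed.

```shell
# List the GOMP symbol versions exported by the system libgomp.
# GOMP_4.0 comes with the libgomp from GCC >= 4.9; Ubuntu 14.04 ships GCC 4.8 by default.
LIBGOMP=/usr/lib/x86_64-linux-gnu/libgomp.so.1   # path from the UnsatisfiedLinkError
if [ -e "$LIBGOMP" ]; then
    strings "$LIBGOMP" | grep '^GOMP_' | sort -u || true
else
    echo "libgomp not found at $LIBGOMP"
fi
```

If GOMP_4.0 is not in the output, installing a newer libgomp (for example, the GCC 4.9 runtime) should let libjniopencv_core.so load.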
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/03/01 08:36:15 INFO SparkContext: Running Spark version 1.4.1
16/03/01 08:36:16 INFO SecurityManager: Changing view acls to: ubuntu
16/03/01 08:36:16 INFO SecurityManager: Changing modify acls to: ubuntu
16/03/01 08:36:16 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ubuntu); users with modify permissions: Set(ubuntu)
16/03/01 08:36:16 INFO Slf4jLogger: Slf4jLogger started
16/03/01 08:36:16 INFO Remoting: Starting remoting
16/03/01 08:36:16 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark...@192.168.56.4:46236]
16/03/01 08:36:16 INFO Utils: Successfully started service 'sparkDriver' on port 46236.
16/03/01 08:36:16 INFO SparkEnv: Registering MapOutputTracker
16/03/01 08:36:16 INFO SparkEnv: Registering BlockManagerMaster
16/03/01 08:36:16 INFO DiskBlockManager: Created local directory at /tmp/spark-d59ce13b-ce92-4a6b-9f48-309abf18c6a4/blockmgr-448725ff-1b71-4d13-afa2-91df5e250370
16/03/01 08:36:16 INFO MemoryStore: MemoryStore started with capacity 530.3 MB
16/03/01 08:36:17 INFO HttpFileServer: HTTP File server directory is /tmp/spark-d59ce13b-ce92-4a6b-9f48-309abf18c6a4/httpd-a1120858-fef8-47db-8db7-91b723a1ca82
16/03/01 08:36:17 INFO HttpServer: Starting HTTP Server
16/03/01 08:36:17 INFO Utils: Successfully started service 'HTTP file server' on port 55073.
16/03/01 08:36:17 INFO SparkEnv: Registering OutputCommitCoordinator
16/03/01 08:36:17 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/03/01 08:36:17 INFO SparkUI: Started SparkUI at http://192.168.56.4:4040
16/03/01 08:36:17 INFO Executor: Starting executor ID driver on host localhost
16/03/01 08:36:17 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41014.
16/03/01 08:36:17 INFO NettyBlockTransferService: Server created on 41014
16/03/01 08:36:17 INFO BlockManagerMaster: Trying to register BlockManager
16/03/01 08:36:17 INFO BlockManagerMasterEndpoint: Registering block manager localhost:41014 with 530.3 MB RAM, BlockManagerId(driver, localhost, 41014)
16/03/01 08:36:17 INFO BlockManagerMaster: Registered BlockManager
16/03/01 08:36:18 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:34
16/03/01 08:36:18 INFO DAGScheduler: Got job 0 (take at PreprocessorSpec.scala:34) with 1 output partitions (allowLocal=false)
16/03/01 08:36:18 INFO DAGScheduler: Final stage: ResultStage 0(take at PreprocessorSpec.scala:34)
16/03/01 08:36:18 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:18 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:18 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at take at PreprocessorSpec.scala:34), which has no missing parents
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(2984) called with curMem=0, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 2.9 KB, free 530.3 MB)
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(1703) called with curMem=2984, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1703.0 B, free 530.3 MB)
16/03/01 08:36:18 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:41014 (size: 1703.0 B, free: 530.3 MB)
16/03/01 08:36:18 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:18 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at take at PreprocessorSpec.scala:34)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/03/01 08:36:18 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1579 bytes)
16/03/01 08:36:18 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/03/01 08:36:18 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1761 bytes result sent to driver
16/03/01 08:36:18 INFO DAGScheduler: ResultStage 0 (take at PreprocessorSpec.scala:34) finished in 0,062 s
16/03/01 08:36:18 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 61 ms on localhost (1/1)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/03/01 08:36:18 INFO DAGScheduler: Job 0 finished: take at PreprocessorSpec.scala:34, took 0,269164 s
16/03/01 08:36:18 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:34
16/03/01 08:36:18 INFO DAGScheduler: Got job 1 (take at PreprocessorSpec.scala:34) with 1 output partitions (allowLocal=false)
16/03/01 08:36:18 INFO DAGScheduler: Final stage: ResultStage 1(take at PreprocessorSpec.scala:34)
16/03/01 08:36:18 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:18 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:18 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at take at PreprocessorSpec.scala:34), which has no missing parents
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(2984) called with curMem=4687, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 2.9 KB, free 530.3 MB)
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(1685) called with curMem=7671, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1685.0 B, free 530.3 MB)
16/03/01 08:36:18 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:41014 (size: 1685.0 B, free: 530.3 MB)
16/03/01 08:36:18 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:18 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at take at PreprocessorSpec.scala:34)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
16/03/01 08:36:18 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, PROCESS_LOCAL, 1577 bytes)
16/03/01 08:36:18 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
16/03/01 08:36:18 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1757 bytes result sent to driver
16/03/01 08:36:18 INFO DAGScheduler: ResultStage 1 (take at PreprocessorSpec.scala:34) finished in 0,001 s
16/03/01 08:36:18 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 12 ms on localhost (1/1)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
16/03/01 08:36:18 INFO DAGScheduler: Job 1 finished: take at PreprocessorSpec.scala:34, took 0,022106 s
16/03/01 08:36:18 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:34
16/03/01 08:36:18 INFO DAGScheduler: Got job 2 (take at PreprocessorSpec.scala:34) with 1 output partitions (allowLocal=false)
16/03/01 08:36:18 INFO DAGScheduler: Final stage: ResultStage 2(take at PreprocessorSpec.scala:34)
16/03/01 08:36:18 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:18 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:18 INFO DAGScheduler: Submitting ResultStage 2 (MapPartitionsRDD[8] at take at PreprocessorSpec.scala:34), which has no missing parents
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(2984) called with curMem=9356, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 2.9 KB, free 530.3 MB)
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(1691) called with curMem=12340, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1691.0 B, free 530.3 MB)
16/03/01 08:36:18 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:41014 (size: 1691.0 B, free: 530.3 MB)
16/03/01 08:36:18 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:18 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (MapPartitionsRDD[8] at take at PreprocessorSpec.scala:34)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
16/03/01 08:36:18 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, localhost, PROCESS_LOCAL, 1582 bytes)
16/03/01 08:36:18 INFO Executor: Running task 0.0 in stage 2.0 (TID 2)
16/03/01 08:36:18 INFO Executor: Finished task 0.0 in stage 2.0 (TID 2). 1763 bytes result sent to driver
16/03/01 08:36:18 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 16 ms on localhost (1/1)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool
16/03/01 08:36:18 INFO DAGScheduler: ResultStage 2 (take at PreprocessorSpec.scala:34) finished in 0,002 s
16/03/01 08:36:18 INFO DAGScheduler: Job 2 finished: take at PreprocessorSpec.scala:34, took 0,027984 s
16/03/01 08:36:18 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:34
16/03/01 08:36:18 INFO DAGScheduler: Got job 3 (take at PreprocessorSpec.scala:34) with 1 output partitions (allowLocal=false)
16/03/01 08:36:18 INFO DAGScheduler: Final stage: ResultStage 3(take at PreprocessorSpec.scala:34)
16/03/01 08:36:18 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:18 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:18 INFO DAGScheduler: Submitting ResultStage 3 (MapPartitionsRDD[11] at take at PreprocessorSpec.scala:34), which has no missing parents
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(2984) called with curMem=14031, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 2.9 KB, free 530.3 MB)
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(1690) called with curMem=17015, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 1690.0 B, free 530.3 MB)
16/03/01 08:36:18 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on localhost:41014 (size: 1690.0 B, free: 530.3 MB)
16/03/01 08:36:18 INFO SparkContext: Created broadcast 3 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:18 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 3 (MapPartitionsRDD[11] at take at PreprocessorSpec.scala:34)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Adding task set 3.0 with 1 tasks
16/03/01 08:36:18 INFO TaskSetManager: Starting task 0.0 in stage 3.0 (TID 3, localhost, PROCESS_LOCAL, 1580 bytes)
16/03/01 08:36:18 INFO Executor: Running task 0.0 in stage 3.0 (TID 3)
16/03/01 08:36:18 INFO Executor: Finished task 0.0 in stage 3.0 (TID 3). 1759 bytes result sent to driver
16/03/01 08:36:18 INFO TaskSetManager: Finished task 0.0 in stage 3.0 (TID 3) in 7 ms on localhost (1/1)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool
16/03/01 08:36:18 INFO DAGScheduler: ResultStage 3 (take at PreprocessorSpec.scala:34) finished in 0,008 s
16/03/01 08:36:18 INFO DAGScheduler: Job 3 finished: take at PreprocessorSpec.scala:34, took 0,023639 s
16/03/01 08:36:18 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:50
16/03/01 08:36:18 INFO DAGScheduler: Got job 4 (take at PreprocessorSpec.scala:50) with 1 output partitions (allowLocal=false)
16/03/01 08:36:18 INFO DAGScheduler: Final stage: ResultStage 4(take at PreprocessorSpec.scala:50)
16/03/01 08:36:18 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:18 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:18 INFO DAGScheduler: Submitting ResultStage 4 (MapPartitionsRDD[14] at take at PreprocessorSpec.scala:50), which has no missing parents
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(3208) called with curMem=18705, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_4 stored as values in memory (estimated size 3.1 KB, free 530.3 MB)
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(1770) called with curMem=21913, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 1770.0 B, free 530.3 MB)
16/03/01 08:36:18 INFO BlockManagerInfo: Added broadcast_4_piece0 in memory on localhost:41014 (size: 1770.0 B, free: 530.3 MB)
16/03/01 08:36:18 INFO SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:18 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 4 (MapPartitionsRDD[14] at take at PreprocessorSpec.scala:50)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Adding task set 4.0 with 1 tasks
16/03/01 08:36:18 INFO TaskSetManager: Starting task 0.0 in stage 4.0 (TID 4, localhost, PROCESS_LOCAL, 1537 bytes)
16/03/01 08:36:18 INFO Executor: Running task 0.0 in stage 4.0 (TID 4)
16/03/01 08:36:18 INFO Executor: Finished task 0.0 in stage 4.0 (TID 4). 1974 bytes result sent to driver
16/03/01 08:36:18 INFO TaskSetManager: Finished task 0.0 in stage 4.0 (TID 4) in 8 ms on localhost (1/1)
16/03/01 08:36:18 INFO DAGScheduler: ResultStage 4 (take at PreprocessorSpec.scala:50) finished in 0,009 s
16/03/01 08:36:18 INFO TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool
16/03/01 08:36:18 INFO DAGScheduler: Job 4 finished: take at PreprocessorSpec.scala:50, took 0,020782 s
16/03/01 08:36:18 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:50
16/03/01 08:36:18 INFO DAGScheduler: Got job 5 (take at PreprocessorSpec.scala:50) with 1 output partitions (allowLocal=false)
16/03/01 08:36:18 INFO DAGScheduler: Final stage: ResultStage 5(take at PreprocessorSpec.scala:50)
16/03/01 08:36:18 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:18 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:18 INFO DAGScheduler: Submitting ResultStage 5 (MapPartitionsRDD[17] at take at PreprocessorSpec.scala:50), which has no missing parents
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(2984) called with curMem=23683, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_5 stored as values in memory (estimated size 2.9 KB, free 530.3 MB)
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(1687) called with curMem=26667, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 1687.0 B, free 530.3 MB)
16/03/01 08:36:18 INFO BlockManagerInfo: Added broadcast_5_piece0 in memory on localhost:41014 (size: 1687.0 B, free: 530.3 MB)
16/03/01 08:36:18 INFO SparkContext: Created broadcast 5 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:18 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 5 (MapPartitionsRDD[17] at take at PreprocessorSpec.scala:50)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Adding task set 5.0 with 1 tasks
16/03/01 08:36:18 INFO TaskSetManager: Starting task 0.0 in stage 5.0 (TID 5, localhost, PROCESS_LOCAL, 1515 bytes)
16/03/01 08:36:18 INFO Executor: Running task 0.0 in stage 5.0 (TID 5)
16/03/01 08:36:18 INFO Executor: Finished task 0.0 in stage 5.0 (TID 5). 1709 bytes result sent to driver
16/03/01 08:36:18 INFO TaskSetManager: Finished task 0.0 in stage 5.0 (TID 5) in 8 ms on localhost (1/1)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool
16/03/01 08:36:18 INFO DAGScheduler: ResultStage 5 (take at PreprocessorSpec.scala:50) finished in 0,009 s
16/03/01 08:36:18 INFO DAGScheduler: Job 5 finished: take at PreprocessorSpec.scala:50, took 0,017768 s
16/03/01 08:36:18 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:68
16/03/01 08:36:18 INFO DAGScheduler: Got job 6 (take at PreprocessorSpec.scala:68) with 1 output partitions (allowLocal=false)
16/03/01 08:36:18 INFO DAGScheduler: Final stage: ResultStage 6(take at PreprocessorSpec.scala:68)
16/03/01 08:36:18 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:18 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:18 INFO DAGScheduler: Submitting ResultStage 6 (MapPartitionsRDD[20] at take at PreprocessorSpec.scala:68), which has no missing parents
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(3208) called with curMem=28354, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_6 stored as values in memory (estimated size 3.1 KB, free 530.2 MB)
16/03/01 08:36:18 INFO MemoryStore: ensureFreeSpace(1770) called with curMem=31562, maxMem=556038881
16/03/01 08:36:18 INFO MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 1770.0 B, free 530.2 MB)
16/03/01 08:36:18 INFO BlockManagerInfo: Added broadcast_6_piece0 in memory on localhost:41014 (size: 1770.0 B, free: 530.3 MB)
16/03/01 08:36:18 INFO SparkContext: Created broadcast 6 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:18 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 6 (MapPartitionsRDD[20] at take at PreprocessorSpec.scala:68)
16/03/01 08:36:18 INFO TaskSchedulerImpl: Adding task set 6.0 with 1 tasks
16/03/01 08:36:18 WARN TaskSetManager: Stage 6 contains a task of very large size (257 KB). The maximum recommended task size is 100 KB.
16/03/01 08:36:18 INFO TaskSetManager: Starting task 0.0 in stage 6.0 (TID 6, localhost, PROCESS_LOCAL, 263669 bytes)
16/03/01 08:36:18 INFO Executor: Running task 0.0 in stage 6.0 (TID 6)
16/03/01 08:36:19 INFO Executor: Finished task 0.0 in stage 6.0 (TID 6). 660504 bytes result sent to driver
16/03/01 08:36:19 INFO TaskSetManager: Finished task 0.0 in stage 6.0 (TID 6) in 365 ms on localhost (1/1)
16/03/01 08:36:19 INFO TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool
16/03/01 08:36:19 INFO DAGScheduler: ResultStage 6 (take at PreprocessorSpec.scala:68) finished in 0,363 s
16/03/01 08:36:19 INFO DAGScheduler: Job 6 finished: take at PreprocessorSpec.scala:68, took 0,383692 s
DefaultPreprocessor converted 256 images in 0.497s
16/03/01 08:36:19 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:68
16/03/01 08:36:19 INFO DAGScheduler: Got job 7 (take at PreprocessorSpec.scala:68) with 1 output partitions (allowLocal=false)
16/03/01 08:36:19 INFO DAGScheduler: Final stage: ResultStage 7(take at PreprocessorSpec.scala:68)
16/03/01 08:36:19 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:19 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:19 INFO DAGScheduler: Submitting ResultStage 7 (MapPartitionsRDD[23] at take at PreprocessorSpec.scala:68), which has no missing parents
16/03/01 08:36:19 INFO MemoryStore: ensureFreeSpace(2984) called with curMem=33332, maxMem=556038881
16/03/01 08:36:19 INFO MemoryStore: Block broadcast_7 stored as values in memory (estimated size 2.9 KB, free 530.2 MB)
16/03/01 08:36:19 INFO MemoryStore: ensureFreeSpace(1689) called with curMem=36316, maxMem=556038881
16/03/01 08:36:19 INFO MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 1689.0 B, free 530.2 MB)
16/03/01 08:36:19 INFO BlockManagerInfo: Added broadcast_7_piece0 in memory on localhost:41014 (size: 1689.0 B, free: 530.3 MB)
16/03/01 08:36:19 INFO SparkContext: Created broadcast 7 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:19 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 7 (MapPartitionsRDD[23] at take at PreprocessorSpec.scala:68)
16/03/01 08:36:19 INFO TaskSchedulerImpl: Adding task set 7.0 with 1 tasks
16/03/01 08:36:19 INFO TaskSetManager: Starting task 0.0 in stage 7.0 (TID 7, localhost, PROCESS_LOCAL, 67048 bytes)
16/03/01 08:36:19 INFO Executor: Running task 0.0 in stage 7.0 (TID 7)
16/03/01 08:36:19 INFO Executor: Finished task 0.0 in stage 7.0 (TID 7). 67562 bytes result sent to driver
16/03/01 08:36:19 INFO TaskSetManager: Finished task 0.0 in stage 7.0 (TID 7) in 8 ms on localhost (1/1)
16/03/01 08:36:19 INFO TaskSchedulerImpl: Removed TaskSet 7.0, whose tasks have all completed, from pool
16/03/01 08:36:19 INFO DAGScheduler: ResultStage 7 (take at PreprocessorSpec.scala:68) finished in 0,001 s
16/03/01 08:36:19 INFO DAGScheduler: Job 7 finished: take at PreprocessorSpec.scala:68, took 0,019660 s
16/03/01 08:36:19 INFO BlockManagerInfo: Removed broadcast_7_piece0 on localhost:41014 in memory (size: 1689.0 B, free: 530.3 MB)
16/03/01 08:36:19 INFO BlockManagerInfo: Removed broadcast_6_piece0 on localhost:41014 in memory (size: 1770.0 B, free: 530.3 MB)
16/03/01 08:36:19 INFO BlockManagerInfo: Removed broadcast_5_piece0 on localhost:41014 in memory (size: 1687.0 B, free: 530.3 MB)
16/03/01 08:36:19 INFO BlockManagerInfo: Removed broadcast_4_piece0 on localhost:41014 in memory (size: 1770.0 B, free: 530.3 MB)
16/03/01 08:36:19 INFO BlockManagerInfo: Removed broadcast_3_piece0 on localhost:41014 in memory (size: 1690.0 B, free: 530.3 MB)
16/03/01 08:36:19 INFO BlockManagerInfo: Removed broadcast_2_piece0 on localhost:41014 in memory (size: 1691.0 B, free: 530.3 MB)
16/03/01 08:36:19 INFO BlockManagerInfo: Removed broadcast_1_piece0 on localhost:41014 in memory (size: 1685.0 B, free: 530.3 MB)
16/03/01 08:36:19 INFO BlockManagerInfo: Removed broadcast_0_piece0 on localhost:41014 in memory (size: 1703.0 B, free: 530.3 MB)
DefaultPreprocessor converted 256 images in 0.577s
16/03/01 08:36:20 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:92
16/03/01 08:36:20 INFO DAGScheduler: Got job 8 (take at PreprocessorSpec.scala:92) with 1 output partitions (allowLocal=false)
16/03/01 08:36:20 INFO DAGScheduler: Final stage: ResultStage 8(take at PreprocessorSpec.scala:92)
16/03/01 08:36:20 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:20 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:20 INFO DAGScheduler: Submitting ResultStage 8 (MapPartitionsRDD[26] at take at PreprocessorSpec.scala:92), which has no missing parents
16/03/01 08:36:20 INFO MemoryStore: ensureFreeSpace(2984) called with curMem=0, maxMem=556038881
16/03/01 08:36:20 INFO MemoryStore: Block broadcast_8 stored as values in memory (estimated size 2.9 KB, free 530.3 MB)
16/03/01 08:36:20 INFO MemoryStore: ensureFreeSpace(1689) called with curMem=2984, maxMem=556038881
16/03/01 08:36:20 INFO MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 1689.0 B, free 530.3 MB)
16/03/01 08:36:20 INFO BlockManagerInfo: Added broadcast_8_piece0 in memory on localhost:41014 (size: 1689.0 B, free: 530.3 MB)
16/03/01 08:36:20 INFO SparkContext: Created broadcast 8 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:20 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 8 (MapPartitionsRDD[26] at take at PreprocessorSpec.scala:92)
16/03/01 08:36:20 INFO TaskSchedulerImpl: Adding task set 8.0 with 1 tasks
16/03/01 08:36:20 INFO TaskSetManager: Starting task 0.0 in stage 8.0 (TID 8, localhost, PROCESS_LOCAL, 1572 bytes)
16/03/01 08:36:20 INFO Executor: Running task 0.0 in stage 8.0 (TID 8)
16/03/01 08:36:20 INFO Executor: Finished task 0.0 in stage 8.0 (TID 8). 1766 bytes result sent to driver
16/03/01 08:36:20 INFO TaskSetManager: Finished task 0.0 in stage 8.0 (TID 8) in 5 ms on localhost (1/1)
16/03/01 08:36:20 INFO TaskSchedulerImpl: Removed TaskSet 8.0, whose tasks have all completed, from pool
16/03/01 08:36:20 INFO DAGScheduler: ResultStage 8 (take at PreprocessorSpec.scala:92) finished in 0,001 s
16/03/01 08:36:20 INFO DAGScheduler: Job 8 finished: take at PreprocessorSpec.scala:92, took 0,016527 s
16/03/01 08:36:20 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:107
16/03/01 08:36:20 INFO DAGScheduler: Got job 9 (take at PreprocessorSpec.scala:107) with 1 output partitions (allowLocal=false)
16/03/01 08:36:20 INFO DAGScheduler: Final stage: ResultStage 9(take at PreprocessorSpec.scala:107)
16/03/01 08:36:20 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:20 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:20 INFO DAGScheduler: Submitting ResultStage 9 (MapPartitionsRDD[29] at take at PreprocessorSpec.scala:107), which has no missing parents
16/03/01 08:36:20 INFO MemoryStore: ensureFreeSpace(2984) called with curMem=4673, maxMem=556038881
16/03/01 08:36:20 INFO MemoryStore: Block broadcast_9 stored as values in memory (estimated size 2.9 KB, free 530.3 MB)
16/03/01 08:36:20 INFO MemoryStore: ensureFreeSpace(1690) called with curMem=7657, maxMem=556038881
16/03/01 08:36:20 INFO MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 1690.0 B, free 530.3 MB)
16/03/01 08:36:20 INFO BlockManagerInfo: Added broadcast_9_piece0 in memory on localhost:41014 (size: 1690.0 B, free: 530.3 MB)
16/03/01 08:36:20 INFO SparkContext: Created broadcast 9 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:20 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 9 (MapPartitionsRDD[29] at take at PreprocessorSpec.scala:107)
16/03/01 08:36:20 INFO TaskSchedulerImpl: Adding task set 9.0 with 1 tasks
16/03/01 08:36:20 INFO TaskSetManager: Starting task 0.0 in stage 9.0 (TID 9, localhost, PROCESS_LOCAL, 1572 bytes)
16/03/01 08:36:20 INFO Executor: Running task 0.0 in stage 9.0 (TID 9)
16/03/01 08:36:20 INFO Executor: Finished task 0.0 in stage 9.0 (TID 9). 1766 bytes result sent to driver
16/03/01 08:36:20 INFO TaskSetManager: Finished task 0.0 in stage 9.0 (TID 9) in 7 ms on localhost (1/1)
16/03/01 08:36:20 INFO TaskSchedulerImpl: Removed TaskSet 9.0, whose tasks have all completed, from pool
16/03/01 08:36:20 INFO DAGScheduler: ResultStage 9 (take at PreprocessorSpec.scala:107) finished in 0,006 s
16/03/01 08:36:20 INFO DAGScheduler: Job 9 finished: take at PreprocessorSpec.scala:107, took 0,016329 s
16/03/01 08:36:20 INFO SparkContext: Starting job: take at PreprocessorSpec.scala:128
16/03/01 08:36:20 INFO DAGScheduler: Got job 10 (take at PreprocessorSpec.scala:128) with 1 output partitions (allowLocal=false)
16/03/01 08:36:20 INFO DAGScheduler: Final stage: ResultStage 10(take at PreprocessorSpec.scala:128)
16/03/01 08:36:20 INFO DAGScheduler: Parents of final stage: List()
16/03/01 08:36:20 INFO DAGScheduler: Missing parents: List()
16/03/01 08:36:20 INFO DAGScheduler: Submitting ResultStage 10 (MapPartitionsRDD[32] at take at PreprocessorSpec.scala:128), which has no missing parents
16/03/01 08:36:20 INFO MemoryStore: ensureFreeSpace(2984) called with curMem=9347, maxMem=556038881
16/03/01 08:36:20 INFO MemoryStore: Block broadcast_10 stored as values in memory (estimated size 2.9 KB, free 530.3 MB)
16/03/01 08:36:20 INFO MemoryStore: ensureFreeSpace(1687) called with curMem=12331, maxMem=556038881
16/03/01 08:36:20 INFO MemoryStore: Block broadcast_10_piece0 stored as bytes in memory (estimated size 1687.0 B, free 530.3 MB)
16/03/01 08:36:20 INFO BlockManagerInfo: Added broadcast_10_piece0 in memory on localhost:41014 (size: 1687.0 B, free: 530.3 MB)
16/03/01 08:36:20 INFO SparkContext: Created broadcast 10 from broadcast at DAGScheduler.scala:874
16/03/01 08:36:20 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 10 (MapPartitionsRDD[32] at take at PreprocessorSpec.scala:128)
16/03/01 08:36:20 INFO TaskSchedulerImpl: Adding task set 10.0 with 1 tasks
16/03/01 08:36:20 WARN TaskSetManager: Stage 10 contains a task of very large size (193 KB). The maximum recommended task size is 100 KB.
16/03/01 08:36:20 INFO TaskSetManager: Starting task 0.0 in stage 10.0 (TID 10, localhost, PROCESS_LOCAL, 198120 bytes)
16/03/01 08:36:20 INFO Executor: Running task 0.0 in stage 10.0 (TID 10)
16/03/01 08:36:20 INFO Executor: Finished task 0.0 in stage 10.0 (TID 10). 199274 bytes result sent to driver
16/03/01 08:36:20 INFO TaskSetManager: Finished task 0.0 in stage 10.0 (TID 10) in 7 ms on localhost (1/1)
16/03/01 08:36:20 INFO TaskSchedulerImpl: Removed TaskSet 10.0, whose tasks have all completed, from pool
16/03/01 08:36:20 INFO DAGScheduler: ResultStage 10 (take at PreprocessorSpec.scala:128) finished in 0,007 s
16/03/01 08:36:20 INFO DAGScheduler: Job 10 finished: take at PreprocessorSpec.scala:128, took 0,016381 s
16/03/01 08:36:20 INFO BlockManagerInfo: Removed broadcast_10_piece0 on localhost:41014 in memory (size: 1687.0 B, free: 530.3 MB)
16/03/01 08:36:20 INFO BlockManagerInfo: Removed broadcast_9_piece0 on localhost:41014 in memory (size: 1690.0 B, free: 530.3 MB)
16/03/01 08:36:20 INFO BlockManagerInfo: Removed broadcast_8_piece0 on localhost:41014 in memory (size: 1689.0 B, free: 530.3 MB)
ImageNetPreprocessor converted 256 images in 0.226s
16/03/01 08:36:20 INFO SparkUI: Stopped Spark web UI at http://192.168.56.4:4040
16/03/01 08:36:20 INFO DAGScheduler: Stopping DAGScheduler
16/03/01 08:36:20 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/03/01 08:36:20 INFO Utils: path = /tmp/spark-d59ce13b-ce92-4a6b-9f48-309abf18c6a4/blockmgr-448725ff-1b71-4d13-afa2-91df5e250370, already present as root for deletion.
16/03/01 08:36:20 INFO MemoryStore: MemoryStore cleared
16/03/01 08:36:20 INFO BlockManager: BlockManager stopped
16/03/01 08:36:20 INFO BlockManagerMaster: BlockManagerMaster stopped
16/03/01 08:36:20 INFO SparkContext: Successfully stopped SparkContext
[info] PreprocessorSpec:
[info] DefaultPreprocessor
[info] - should preserve scalar values
[info] DefaultPreprocessor
[info] - should preserve array values
16/03/01 08:36:20 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
[info] DefaultPreprocessor
[info] - should be fast
[info] ImageNetPreprocessor
[info] - should subtract mean
[info] ImageNetPreprocessor
[info] - should subtract mean and crop image
[info] ImageNetPreprocessor
[info] - should be fast
[info] LoadAdultDataSpec:
[info] - should be able to load the adult dataset !!! IGNORED !!!
16/03/01 08:36:20 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/03/01 08:36:20 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
[info] Run completed in 7 seconds, 410 milliseconds.

[info] Total number of tests run: 6
[info] Suites: completed 4, aborted 0
[info] Tests: succeeded 6, failed 0, canceled 0, ignored 1, pending 0
[info] All tests passed.
[error] Error during tests:
[error]         CaffeNetSpec
16/03/01 08:36:20 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.

[error] (test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 9 s, completed 01-mar-2016 8:36:20
16/03/01 08:36:20 INFO Utils: Shutdown hook called
16/03/01 08:36:20 INFO Utils: Deleting directory /tmp/spark-d59ce13b-ce92-4a6b-9f48-309abf18c6a4

2016-02-29 19:50 GMT+01:00 Robert Nishihara <robertn...@gmail.com>:
Can you post the full error message? There should be some more information further down (e.g., about a missing .so file).

Can you tell us a bit more about your setup? What operating system (and version) are you using?

On Mon, Feb 29, 2016 at 12:12 AM MªLuz Morales <mlz...@gmail.com> wrote:
Hi,
thanks for your response.
Now, I get the following error when I put sbt assembly:

[error] Could not run test CaffeNetSpec: java.lang.UnsatisfiedLinkError: no jniopencv_core in java.library.path
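One way to investigate an UnsatisfiedLinkError like this is to check where the JVM looks for native libraries. A diagnostic sketch (the cache directories below are assumptions, not part of the thread -- adjust them to your machine):

```shell
# On Linux the JVM seeds java.library.path from LD_LIBRARY_PATH,
# so first see what it is set to (prints "<unset>" if empty):
echo "LD_LIBRARY_PATH=${LD_LIBRARY_PATH:-<unset>}"

# Then look for the native OpenCV library in likely dependency caches
# (example locations -- javacpp/ivy may put things elsewhere):
for dir in "$HOME/.ivy2" "$HOME/.javacpp"; do
    if [ -d "$dir" ]; then
        find "$dir" -name 'libjniopencv_core*' 2>/dev/null
    fi
done
```

If the library turns up in a cache but is not on LD_LIBRARY_PATH, exporting that directory before running sbt may help.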


2016-02-26 18:17 GMT+01:00 Robert Nishihara <robertn...@gmail.com>:
This can be fixed by running

    export SPARKNET_HOME=/root/SparkNet/

from the command line (but replace /root/SparkNet/ by the path to the SparkNet directory on your machine). It probably makes sense to just put that line in your ~/.bashrc so that it happens automatically.
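For example, one way to persist the variable (a sketch; the path /home/ubuntu/SparkNet/ is illustrative -- substitute your own checkout location):

```shell
# Example path -- replace with the location of your SparkNet checkout:
SPARKNET_PATH=/home/ubuntu/SparkNet/

# Append the export to ~/.bashrc once, so new shells pick it up:
grep -q 'SPARKNET_HOME' "$HOME/.bashrc" 2>/dev/null || \
    echo "export SPARKNET_HOME=$SPARKNET_PATH" >> "$HOME/.bashrc"

# Also set it in the current shell so it takes effect immediately:
export SPARKNET_HOME=$SPARKNET_PATH
echo "SPARKNET_HOME=$SPARKNET_HOME"
```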

On Fri, Feb 26, 2016 at 2:08 AM MªLuz Morales <mlz...@gmail.com> wrote:
Hi,
I tried
cd SparkNet/
sbt assembly
and I got the following error:

[info] NDArraySpec:
java.util.NoSuchElementException: key not found: SPARKNET_HOME
        at scala.collection.MapLike$class.default(MapLike.scala:228)
        at scala.collection.AbstractMap.default(Map.scala:58)
        at scala.collection.MapLike$class.apply(MapLike.scala:141)
        at scala.collection.AbstractMap.apply(Map.scala:58)
        at CaffeNetSpec.<init>(CaffeNetSpec.scala:10)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at java.lang.Class.newInstance(Class.java:383)
        at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:621)
        at sbt.TestRunner.runTest$1(TestFramework.scala:76)
        at sbt.TestRunner.run(TestFramework.scala:85)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
        at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:185)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
        at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
        at sbt.TestFunction.apply(TestFramework.scala:207)
        at sbt.Tests$.sbt$Tests$$processRunnable$1(Tests.scala:239)
        at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
        at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
        at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
        at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:237)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
[error] Could not run test CaffeNetSpec: java.util.NoSuchElementException: key not found: SPARKNET_HOME
...

[info] Total number of tests run: 6
[info] Suites: completed 4, aborted 0
[info] Tests: succeeded 6, failed 0, canceled 0, ignored 1, pending 0
[info] All tests passed.
[error] Error during tests:
[error]         CaffeNetSpec
[error] (test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 7 s, completed 26-feb-2016 11:05:06

Regards

 

Philipp Moritz

unread,
Mar 2, 2016, 3:40:45 AM3/2/16
to sparknet-users
Hi,

can you try using the new .jars we just uploaded?


At one point we uploaded .jars that were built with g++ 4.9, which you might have picked up (we are now putting a timestamp on our snapshots to avoid this kind of problem in the future).

If the error persists, can you post the output of g++ --version? If your g++ is older than the g++ 4.8.4 we used for building, that could be the source of the problem.
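For reference, a quick way to report the compiler version (a sketch; it prints a fallback line if g++ is not installed):

```shell
# Print the first line of g++'s version banner for comparison with 4.8.4:
if command -v g++ >/dev/null 2>&1; then
    g++ --version | head -1
else
    echo "g++ not found on PATH"
fi
```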

Best,
Philipp.
16/03/01 08:36:16 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.56.4:46236]
...

Robert Nishihara

unread,
Mar 2, 2016, 3:46:27 AM3/2/16
to Philipp Moritz, sparknet-users
To be clear, we are no longer using g++ 4.9.

To use the new jars, you will probably have to remove some cached build artifacts. Something like:
    rm -r SparkNet/target
    rm -r /root/.ivy2/cache/org.bytedeco*

though the specific directories may vary on your machine.

