run OLAP together


Ranger Tsao

May 4, 2016, 5:28:49 AM
to Gremlin-users

TinkerPop version: 3.1.1


Start a Gremlin Server backed by a Hadoop graph (hadoop-gryo.properties).


Start two Gremlin Consoles, connect both to the Gremlin Server, and submit the same OLAP traversal (g.V().count()) from each.
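For context, a minimal hadoop-gryo.properties in the shape shipped with the TinkerPop 3.1.x distribution looks roughly like the sketch below; the input path and Spark parallelism are illustrative assumptions, not taken from this thread:

```properties
# HadoopGraph read from Gryo files, executed with SparkGraphComputer
gremlin.graph=org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph
gremlin.hadoop.graphInputFormat=org.apache.tinkerpop.gremlin.hadoop.structure.io.gryo.GryoInputFormat
gremlin.hadoop.graphOutputFormat=org.apache.tinkerpop.gremlin.hadoop.structure.io.gryo.GryoOutputFormat
gremlin.hadoop.inputLocation=data/tinkerpop-modern.kryo
gremlin.hadoop.outputLocation=output

# Spark settings -- local[4] is an assumption for a single-machine test
spark.master=local[4]
spark.serializer=org.apache.spark.serializer.KryoSerializer
```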


The Gremlin Server then throws the following exception:



[WARN] AbstractEvalOpProcessor - Exception processing a script on request [RequestMessage{, requestId=c090d1d9-9cef-4ac4-819b-a3b2dd472274, op='eval', processor='', args={gremlin=g.V().count(), bindings={}, batchSize=64}}].
java.lang.IllegalStateException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 3.0 failed 1 times, most recent failure: Lost task 0.0 in stage 3.0 (TID 1, localhost): org.apache.spark.SparkException: File /private/var/folders/24/fxdznvxj4f1f_wqkxpx9hth80000gn/T/spark-e266b627-b2cd-4802-8284-cb68bfc86566/userFiles-c4c98373-c20a-4e6f-ab3e-9f8b61818d62/jackson-mapper-asl-1.9.13.jar exists and does not match contents of http://localhost:61621/jars/jackson-mapper-asl-1.9.13.jar
    at org.apache.spark.util.Utils$.copyFile(Utils.scala:464)
    at org.apache.spark.util.Utils$.downloadFile(Utils.scala:416)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:557)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:369)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
    at org.apache.tinkerpop.gremlin.process.computer.traversal.step.map.ComputerResultStep.processNextStart(ComputerResultStep.java:82)
    at org.apache.tinkerpop.gremlin.process.traversal.step.util.AbstractStep.hasNext(AbstractStep.java:140)
    at org.apache.tinkerpop.gremlin.process.traversal.util.DefaultTraversal.hasNext(DefaultTraversal.java:147)
    at org.apache.tinkerpop.gremlin.server.op.AbstractEvalOpProcessor.handleIterator(AbstractEvalOpProcessor.java:249)
    at org.apache.tinkerpop.gremlin.server.op.AbstractEvalOpProcessor.lambda$evalOpInternal$54(AbstractEvalOpProcessor.java:199)
    at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor.lambda$eval$0(GremlinExecutor.java:270)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 3.0 failed 1 times, most recent failure: Lost task 0.0 in stage 3.0 (TID 1, localhost): org.apache.spark.SparkException: File /private/var/folders/24/fxdznvxj4f1f_wqkxpx9hth80000gn/T/spark-e266b627-b2cd-4802-8284-cb68bfc86566/userFiles-c4c98373-c20a-4e6f-ab3e-9f8b61818d62/jackson-mapper-asl-1.9.13.jar exists and does not match contents of http://localhost:61621/jars/jackson-mapper-asl-1.9.13.jar
    at org.apache.spark.util.Utils$.copyFile(Utils.scala:464)
    at org.apache.spark.util.Utils$.downloadFile(Utils.scala:416)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:557)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:369)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
    at org.apache.tinkerpop.gremlin.process.computer.traversal.step.map.ComputerResultStep.processNextStart(ComputerResultStep.java:80)
    ... 9 more
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 3.0 failed 1 times, most recent failure: Lost task 0.0 in stage 3.0 (TID 1, localhost): org.apache.spark.SparkException: File /private/var/folders/24/fxdznvxj4f1f_wqkxpx9hth80000gn/T/spark-e266b627-b2cd-4802-8284-cb68bfc86566/userFiles-c4c98373-c20a-4e6f-ab3e-9f8b61818d62/jackson-mapper-asl-1.9.13.jar exists and does not match contents of http://localhost:61621/jars/jackson-mapper-asl-1.9.13.jar
    at org.apache.spark.util.Utils$.copyFile(Utils.scala:464)
    at org.apache.spark.util.Utils$.downloadFile(Utils.scala:416)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:557)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:369)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1824)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1837)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1850)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1921)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:902)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:900)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
    at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:900)
    at org.apache.spark.api.java.JavaRDDLike$class.foreachPartition(JavaRDDLike.scala:222)
    at org.apache.spark.api.java.AbstractJavaRDDLike.foreachPartition(JavaRDDLike.scala:47)
    at org.apache.tinkerpop.gremlin.spark.process.computer.SparkExecutor.executeVertexProgramIteration(SparkExecutor.java:134)
    at org.apache.tinkerpop.gremlin.spark.process.computer.SparkGraphComputer.lambda$submitWithExecutor$28(SparkGraphComputer.java:211)
    at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
    ... 3 more
Caused by: org.apache.spark.SparkException: File /private/var/folders/24/fxdznvxj4f1f_wqkxpx9hth80000gn/T/spark-e266b627-b2cd-4802-8284-cb68bfc86566/userFiles-c4c98373-c20a-4e6f-ab3e-9f8b61818d62/jackson-mapper-asl-1.9.13.jar exists and does not match contents of http://localhost:61621/jars/jackson-mapper-asl-1.9.13.jar
    at org.apache.spark.util.Utils$.copyFile(Utils.scala:464)
    at org.apache.spark.util.Utils$.downloadFile(Utils.scala:416)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:557)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:369)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    ... 3 more
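The root failure is Spark's executor-side dependency fetch: a jar already sitting in the executor's userFiles directory no longer byte-matches the copy served over HTTP by the second job's file server, and the fetch aborts rather than overwrite it. Spark has a configuration property, spark.files.overwrite, whose documented purpose matches this exact message; as an untested workaround suggestion (not something proposed in this thread), it could be added to the graph's properties file:

```properties
# Untested workaround suggestion: let executors overwrite a previously
# fetched file whose contents differ from the copy on the driver's
# file server, instead of failing the task.
spark.files.overwrite=true
```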

Stephen Mallette

May 5, 2016, 3:21:11 PM
to Gremlin-users
I'm not sure what the problem is here, but I think you should create an issue in JIRA, as I sense this is a bug.

