java.util.NoSuchElementException: None.get


Digsss

Dec 7, 2016, 8:12:55 AM
to actionml-user
I am using UR 0.3.0 and getting this error while training:

[ERROR] [Executor] Exception in task 0.0 in stage 25.0 (TID 448)
[WARN] [TaskSetManager] Lost task 0.0 in stage 25.0 (TID 448, localhost): java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:313)
        at scala.None$.get(Option.scala:311)
        at org.template.PopModel$$anonfun$3.apply(PopModel.scala:69)
        at org.template.PopModel$$anonfun$3.apply(PopModel.scala:69)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:201)
        at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:56)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:70)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:70)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

[ERROR] [TaskSetManager] Task 0 in stage 25.0 failed 1 times; aborting job
[WARN] [TaskSetManager] Lost task 2.0 in stage 25.0 (TID 450, localhost): TaskKilled (killed intentionally)
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 25.0 failed 1 times, most recent failure: Lost task 0.0 in stage 25.0 (TID 448, localhost): java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:313)
        at scala.None$.get(Option.scala:311)
        at org.template.PopModel$$anonfun$3.apply(PopModel.scala:69)
        at org.template.PopModel$$anonfun$3.apply(PopModel.scala:69)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:201)
        at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:56)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:70)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:70)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1273)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1264)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1263)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1263)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1457)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1418)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

Digsss

Dec 8, 2016, 1:11:13 AM
to actionml-user
Please help..

Pat Ferrel

Dec 8, 2016, 7:33:07 AM
to Digsss, actionml-user
It’s very hard to support such an old version of the UR. The only time I’ve seen this is when there was a bad build issue. Try removing any jars in target/scala-2.10. There should be 2 with the same version number; maybe you have 4 with conflicting version numbers.
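A minimal sketch of the cleanup Pat describes. The paths and jar names below are hypothetical stand-ins (a real UR template checkout would have its sbt output under ./target/scala-2.10); this simulates a build directory polluted by jars from two different builds and clears it:

```shell
# Hypothetical simulation: in a real UR template checkout you would
# operate on ./target/scala-2.10 directly instead of a temp directory.
dir="$(mktemp -d)/target/scala-2.10"
mkdir -p "$dir"

# Pretend two successive builds left jars with conflicting versions behind
# (illustrative names only).
touch "$dir/template-ur_2.10-0.2.3.jar"
touch "$dir/template-ur_2.10-0.3.0.jar"

echo "before: $(ls "$dir" | wc -l) jar(s)"

# Remove every jar so the next build produces one consistent set.
rm -f "$dir"/*.jar
echo "after: $(ls "$dir" | wc -l) jar(s)"
```

After clearing the real directory, rerun `pio build` from the template directory so only the freshly built jars remain before training again.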

We are moving to Apache PredictionIO 0.10.0-incubating and the UR v0.5.0. It might be time to upgrade.


Digsss

Dec 8, 2016, 8:08:20 AM
to actionml-user
Has the version of Apache Mahout that the UR requires been released? I mean, is Apache Mahout's separate build process no longer needed?

Pat Ferrel

Dec 8, 2016, 8:32:03 AM
to Digsss, actionml-user
It is required for the UR v0.4.2 and up.


Digsss

Dec 9, 2016, 1:05:07 AM
to actionml-user
Yes, that's why I am not upgrading. I will update to the latest version once that dependency is removed.