Tests failing


Evan Chan

Jul 20, 2013, 7:10:23 PM
to spark-de...@googlegroups.com
Hey guys,

I haven't updated my master branch in about 2-3 weeks and the DistributedSuite, amongst others, seems to fail for me when I run "sbt test" from the root project.

✦  git log --oneline | head -3
c40f0f2 Merge pull request #711 from shivaram/ml-generators
413b841 Merge pull request #717 from viirya/dev1
d1738d7 also exclude asm for hadoop2. hadoop1 looks like no need to do that too.


[info] DistributedSuite:
[info] - task throws not serializable exception
[info] - local-cluster format
[info] - simple groupByKey
[info] - groupByKey where map output sizes exceed maxMbInFlight *** FAILED ***
[info]   spark.SparkException: Job failed: Task 1.0:1 failed more than 4 times
[info]   at spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:709)
[info]   at spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:707)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:707)
[info]   at spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:352)
[info]   at spark.scheduler.DAGScheduler.spark$scheduler$DAGScheduler$$run(DAGScheduler.scala:414)
[info]   at spark.scheduler.DAGScheduler$$anon$1.run(DAGScheduler.scala:132)
[info]   ...
[info] - accumulators *** FAILED ***
[info]   spark.SparkException: Job failed: Task 0.0:0 failed more than 4 times
[info]   at spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:709)
[info]   at spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:707)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:707)
[info]   at spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:352)
[info]   at spark.scheduler.DAGScheduler.spark$scheduler$DAGScheduler$$run(DAGScheduler.scala:414)
[info]   at spark.scheduler.DAGScheduler$$anon$1.run(DAGScheduler.scala:132)
[info]   ...
[info] - broadcast variables *** FAILED ***
[info]   spark.SparkException: Job failed: Task 0.0:0 failed more than 4 times
[info]   at spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:709)
[info]   at spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:707)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:707)
[info]   at spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:352)
[info]   at spark.scheduler.DAGScheduler.spark$scheduler$DAGScheduler$$run(DAGScheduler.scala:414)
[info]   at spark.scheduler.DAGScheduler$$anon$1.run(DAGScheduler.scala:132)
[info]   ...
[info] - repeatedly failing task
[info] - caching
[info] - caching on disk
[info] - caching in memory, replicated
[

Most of the failing tests show this error, but one is different:

[info] DriverSuite:
Exception in thread "main" java.lang.NoClassDefFoundError: spark/DriverWithoutCleanup
Caused by: java.lang.ClassNotFoundException: spark.DriverWithoutCleanup
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
[info] - driver should exit after finishing *** FAILED ***
[info]   SparkException was thrown during property evaluation. (DriverSuite.scala:35)
[info]   Message: Process List(./run, spark.DriverWithoutCleanup, local) exited with code 1
[info]   Occurred at table row 0 (zero based, not counting headings), which had values (
[info]     master = local

I do see DriverWithoutCleanup in the DriverSuite, so I'm not sure why that's failing.

Is this happening for anybody else?

thanks,
Evan

Mark Hamstra

Jul 20, 2013, 9:43:35 PM
to spark-de...@googlegroups.com
What I am seeing recently is spark.repl.ReplSuite occasionally hanging in "interacting with files" (consistently on my Linux machine right now, actually...).  I only see something similar to what you are reporting if I use an sbt other than <spark>/sbt/sbt.  There have been changes there recently...

Evan Chan

Jul 21, 2013, 8:59:10 PM
to spark-de...@googlegroups.com
Thanks Mark, that was it.  I was using my own sbt script instead of sbt/sbt.  Now only one test - the DriverSuite - fails, with a "scala.util.Try" not found error; I think that's because it calls the ./run script, which relies on SCALA_HOME being set.

-Evan



--
Evan Chan
Staff Engineer
e...@ooyala.com


Mark Hamstra

Jul 21, 2013, 9:08:50 PM
to spark-de...@googlegroups.com
scala.util.Try doesn't exist in the Scala library before 2.9.3....
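
(For reference, a minimal sketch - not from Spark or the DriverSuite, just an illustration - of the kind of Try usage that only compiles against 2.9.3 or later; against a 2.9.2 library the import itself fails to resolve:)

import scala.util.{Try, Success, Failure}

// scala.util.Try wraps a computation and captures any thrown exception as a value.
// The class only ships with Scala 2.9.3+ (and 2.10+), so building or running this
// against a 2.9.2 library fails with a "not found" error / ClassNotFoundException.
object TryExample {
  def main(args: Array[String]) {
    val parsed: Try[Int] = Try("42".toInt)
    parsed match {
      case Success(n) => println("parsed: " + n)
      case Failure(e) => println("failed to parse: " + e.getMessage)
    }
  }
}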

Patrick Wendell

Jul 21, 2013, 9:10:30 PM
to spark-de...@googlegroups.com

Yes, that issue is probably because Spark now requires Scala 2.9.3; if you are still on 2.9.2, this error will show up.

---
sent from my phone


Evan Chan

Jul 21, 2013, 10:28:30 PM
to spark-de...@googlegroups.com
Thanks, yes, I realize that.  It's too bad the run script requires SCALA_HOME even though sbt already has 2.9.3 on the classpath.
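
(A quick, hypothetical check of which Scala library the run script's SCALA_HOME actually resolves - just a sketch; compile and run it with that scala binary:)

// Prints the Scala library version on the classpath and whether scala.util.Try
// is present (Try only ships with 2.9.3 and later), which is enough to tell
// whether SCALA_HOME points at a library new enough for Spark.
object VersionCheck {
  def main(args: Array[String]) {
    println(scala.util.Properties.versionString)  // e.g. "version 2.9.3"
    val hasTry =
      try { Class.forName("scala.util.Try"); true }
      catch { case _: ClassNotFoundException => false }
    println("scala.util.Try available: " + hasTry)
  }
}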

-Evan
To be free is not merely to cast off one's chains, but to live in a way that respects & enhances the freedom of others. (#NelsonMandela)