new Spark Context java.lang.NoSuchMethodError


Eric Kimbrel

unread,
Oct 23, 2012, 1:16:57 PM10/23/12
to spark...@googlegroups.com
I've recently updated to the latest code from the Spark repository and can no longer create a new SparkContext.

Using 0.7.0-SNAPSHOT:

 val sc = new SparkContext("local", "some_name")


java.lang.NoSuchMethodError: scala.Predef$.augmentString(Ljava/lang/String;)Lscala/collection/immutable/StringOps;
at spark.SparkContext.<init>(SparkContext.scala:101)
at spark.SparkContext.<init>(SparkContext.scala:83)
at com.pfl.bigdata.MapJoinCorrelate$.main(OptimizedCorrelator.scala:28)
at com.pfl.bigdata.MapJoinCorrelate.main(OptimizedCorrelator.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:71)
at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:139)
at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:71)
at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:139)
at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:28)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:45)
at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:35)
at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:45)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:74)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:96)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:105)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)

Matei Zaharia

unread,
Oct 23, 2012, 1:20:07 PM10/23/12
to spark...@googlegroups.com
It sounds like you're using a different version of Scala, or have somehow not linked to Spark correctly. However, don't bother trying to use the master branch. There's nothing new in it except a version number right now. Stick with the 0.6 release from http://www.spark-project.org/downloads.html.

Matei

Eric Kimbrel

unread,
Oct 23, 2012, 5:11:08 PM10/23/12
to spark...@googlegroups.com
Looks like I was using Scala 2.10 instead of 2.9.2. I'll see if that fixes it. Thanks.
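
For anyone hitting the same NoSuchMethodError: it typically means the application was compiled or run with a Scala version different from the one Spark was built against, since Scala does not guarantee binary compatibility across minor versions (2.9.x vs 2.10.x). A minimal sketch of a build.sbt that pins the Scala version to match the 0.6 release is below; the exact artifact coordinates are an assumption and should be checked against the download page Matei linked.

```scala
// build.sbt (sketch — verify artifact name/version against the Spark downloads page)

// Pin the Scala version to the one Spark 0.6 was compiled against.
// Mixing 2.10 binaries with 2.9.x-compiled libraries produces
// java.lang.NoSuchMethodError at runtime, as seen above.
scalaVersion := "2.9.2"

// Hypothetical dependency line; adjust groupId/artifactId to match
// however you obtained the Spark 0.6 jars (local publish or repo).
libraryDependencies += "org.spark-project" % "spark-core_2.9.2" % "0.6.0"
```

Running the program with the matching `scala` 2.9.2 launcher (rather than a 2.10 install) matters for the same reason, since the stack trace shows the app being launched through `scala.tools.nsc.MainGenericRunner`.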