Issues with Spark shell vs Scala

Issues with Spark shell vs Scala Ankit 1/23/13 5:30 PM
Hey guys,

Running the following code:

trait Foo
class A extends Foo
class B extends Foo
val x = List[Foo](new A(), new B())

works fine in the scala interpreter, but fails in the Spark shell with the following error:

scala> List[Foo](new A(), new B())
<console>:20: error: type mismatch;
 found   : this.A
 required: this.Foo
       List[Foo](new A(), new B())
                 ^
<console>:20: error: type mismatch;
 found   : this.B
 required: this.Foo
       List[Foo](new A(), new B())
                          ^

Any ideas why this is the case?

Re: Issues with Spark shell vs Scala Matei Zaharia 1/23/13 5:37 PM
Yup, unfortunately we don't support traits and classes defined in the interpreter very well. It would be better to compile these into a JAR and add it to your classpath when you launch the shell (add it to the SPARK_CLASSPATH environment variable).
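
The workaround above can be sketched as a plain Scala file compiled outside the shell (file name, JAR name, and the Demo object are illustrative, not from the original thread):

```scala
// Foo.scala -- compile this outside the shell so the classes are
// top-level rather than interpreter-generated inner classes, e.g.:
//   scalac Foo.scala
//   jar cf foo.jar *.class
//   SPARK_CLASSPATH=foo.jar ./spark-shell
trait Foo
class A extends Foo
class B extends Foo

object Demo {
  def main(args: Array[String]): Unit = {
    // With Foo, A, and B as ordinary top-level classes,
    // the subtyping works as expected:
    val xs = List[Foo](new A(), new B())
    println(xs.length)
  }
}
```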

This is happening because we had to slightly change how the interpreter compiles each line of code you type in order to capture dependencies between lines, and as a result all these classes become inner classes.
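
Roughly why inner classes cause the error can be sketched in plain Scala (a simplified illustration; the interpreter generates its own wrapper classes, the LineWrapper name here is made up):

```scala
// Each interpreter line is compiled inside a wrapper, so a trait
// defined there becomes a path-dependent inner type tied to a
// particular wrapper instance.
class LineWrapper {
  trait Foo
  class A extends Foo
}

object ReplSketch {
  def main(args: Array[String]): Unit = {
    val w1 = new LineWrapper
    val w2 = new LineWrapper
    val a: w1.Foo = new w1.A // fine: same outer instance
    // val b: w2.Foo = new w1.A // would not compile: w1.Foo and w2.Foo
    // are distinct types, which is the "found: this.A, required:
    // this.Foo" mismatch seen in the Spark shell
    println(a.isInstanceOf[w1.Foo])
  }
}
```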

Matei
Re: Issues with Spark shell vs Scala Ankit 1/23/13 5:49 PM
Oh, okay, that makes more sense, thanks!