Issues with Spark shell vs Scala

asha...@palantir.com

Jan 23, 2013, 8:30:10 PM
to spark...@googlegroups.com
Hey guys,

Running the following code:

trait Foo
class A extends Foo
class B extends Foo
val x = List[Foo](new A(), new B())

works fine in the Scala interpreter, but fails in the Spark shell with the following error:

scala> List[Foo](new A(), new B())
<console>:20: error: type mismatch;
 found   : this.A
 required: this.Foo
       List[Foo](new A(), new B())
                 ^
<console>:20: error: type mismatch;
 found   : this.B
 required: this.Foo
       List[Foo](new A(), new B())
                          ^

Any ideas why this is the case?

Matei Zaharia

Jan 23, 2013, 8:37:49 PM
to spark...@googlegroups.com
Yup, unfortunately we don't support traits and classes defined in the interpreter very well. It would be better to compile these into a JAR and put that on your classpath when you launch the shell (add it to the SPARK_CLASSPATH environment variable).
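
For example, something like this (a rough sketch; the file and jar names are just placeholders, and the exact launch command depends on your Spark version):

// Foo.scala -- define the hierarchy in a plain source file instead of the shell
trait Foo
class A extends Foo
class B extends Foo

Then compile with scalac Foo.scala, package the classes with jar cf foo.jar Foo.class A.class B.class, and launch the shell with SPARK_CLASSPATH=foo.jar ./spark-shell. After that, List[Foo](new A(), new B()) should work, since Foo, A, and B are now ordinary top-level classes.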

This is happening because we had to slightly change the way the interpreter compiles each line of code you type, in order to capture dependencies between lines, and as a result all of these classes become inner classes.
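
To illustrate the effect (a simplified sketch, not the actual code the REPL generates -- the real wrapper names differ):

// Once definitions live inside a wrapper class rather than a top-level
// object, their types become path-dependent: each Wrapper instance gets
// its own Foo and A types, which is why the error mentions this.A / this.Foo.
class Wrapper {
  trait Foo
  class A extends Foo
}

val w1 = new Wrapper
val w2 = new Wrapper
val ok = List[w1.Foo](new w1.A)   // fine: same enclosing instance
// List[w1.Foo](new w2.A)         // type mismatch: w2.A is not a w1.Foo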

Matei

Ankit

Jan 23, 2013, 8:49:28 PM
to spark...@googlegroups.com
Oh, okay, that makes more sense, thanks!