Issues with Spark shell vs Scala


asha...@palantir.com

Jan 23, 2013, 8:30:10 PM
To: spark...@googlegroups.com
Hey guys,

Running the following code:

trait Foo
class A extends Foo
class B extends Foo
val x = List[Foo](new A(), new B())

works fine in the Scala interpreter, but fails in the Spark shell with the following error:

scala> List[Foo](new A(), new B())
<console>:20: error: type mismatch;
 found   : this.A
 required: this.Foo
       List[Foo](new A(), new B())
                 ^
<console>:20: error: type mismatch;
 found   : this.B
 required: this.Foo
       List[Foo](new A(), new B())
                          ^

Any ideas why this is the case?

Matei Zaharia

Jan 23, 2013, 8:37:49 PM
To: spark...@googlegroups.com
Yup, unfortunately we don't support traits and classes defined in the interpreter very well. It would be better to compile these into a JAR and add that to your classpath when you launch the shell (add it to the SPARK_CLASSPATH environment variable).
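The JAR workaround might look like the following. The file name, JAR name, and spark-shell path here are illustrative, not prescribed by the thread:

```shell
# Compile the shared traits/classes ahead of time (file names are made up).
mkdir -p classes
cat > Foo.scala <<'EOF'
trait Foo
class A extends Foo
class B extends Foo
EOF
scalac -d classes Foo.scala
jar cf foo.jar -C classes .

# Put the JAR on the Spark shell's classpath before launching it.
export SPARK_CLASSPATH="$PWD/foo.jar"
./spark-shell
```

Inside the shell, `new A()` and `new B()` then refer to the precompiled top-level classes rather than interpreter-generated inner classes, so `List[Foo](new A(), new B())` type-checks as it does in the plain Scala interpreter.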

This is happening because we had to slightly change the way the interpreter compiles each line of code typed in, in order to capture dependencies between lines, and as a result all these classes became inner classes.
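The inner-class behavior described above can be sketched in plain Scala. The wrapper name `ReplWrapper` below is illustrative, not Spark's actual generated class:

```scala
// Sketch: each interpreted line is compiled as a member of a wrapper class,
// so user-defined types become inner classes, path-dependent on the wrapper
// instance that contains them.
class ReplWrapper {
  trait Foo
  class A extends Foo
  class B extends Foo
}

object Demo {
  def main(args: Array[String]): Unit = {
    val line = new ReplWrapper
    // Within a single wrapper instance the subtyping still holds:
    val xs: List[line.Foo] = List(new line.A, new line.B)
    println(xs.length)

    // Across different wrapper instances the types differ, which is the
    // shape of the shell error "found: this.A, required: this.Foo":
    // val other = new ReplWrapper
    // val bad: List[other.Foo] = List(new line.A)  // does not compile
  }
}
```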

Matei

Ankit

Jan 23, 2013, 8:49:28 PM
To: spark...@googlegroups.com
Oh, okay, that makes more sense, thanks!