Scala signature package has wrong version expected: 5.0 found: 45.0 in scala.package after Export?


denis....@googlemail.com

Oct 27, 2015, 8:06:56 AM
to Scala IDE User
Hi,
 
I have created a small Scala Program that queries some data using JDBC driver.
In the IDE and with different JVMs (1.7, 1.8, etc.) invoking the Scala Program from the command line works great.
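 
For context, the query itself is just a plain Spark DataFrameReader JDBC load, roughly like the sketch below (the connection URL, driver class and table name here are placeholders, not the real ones):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.SQLContext

  object IMSAccessSample extends App {
    val sc = new SparkContext(new SparkConf().setAppName("IMSAccessSample"))
    val sqlContext = new SQLContext(sc)

    // Standard Spark 1.x JDBC read; url/driver/dbtable are placeholders.
    val df = sqlContext.read.format("jdbc")
      .option("url", "jdbc:<vendor>://<host>:<port>/<database>")
      .option("driver", "<vendor.jdbc.Driver>")
      .option("dbtable", "<TABLE>")
      .load()

    df.show()
  }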
 
But on one of the target systems I get the following error, and I don't know whether it's a problem with my code, the JVM implementation, Spark, or Scala. The exception is attached below; any ideas how to fix it?
 
 java.lang.ExceptionInInitializerError
  at java.lang.J9VMInternals.ensureError(J9VMInternals.java:134)
  at java.lang.J9VMInternals.recordInitializationFailure(J9VMInternals.java:123)
  at org.apache.spark.sql.jdbc.JDBCRDD$.org$apache$spark$sql$jdbc$JDBCRDD$$getCatalystType(JDBCRDD.scala:62)
  at org.apache.spark.sql.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:137)
  at org.apache.spark.sql.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:137)
  at scala.Option.getOrElse(Option.scala:120)
  at org.apache.spark.sql.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:136)
  at org.apache.spark.sql.jdbc.JDBCRelation.<init>(JDBCRelation.scala:128)
  at org.apache.spark.sql.jdbc.DefaultSource.createRelation(JDBCRelation.scala:113)
  at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:269)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
  at com.ibm.imstest.IMSAccessSample$delayedInit$body.apply(IMSAccessSample.scala:26)
  at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
  at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
  at scala.App$$anonfun$main$1.apply(App.scala:71)
  at scala.App$$anonfun$main$1.apply(App.scala:71)
  at scala.collection.immutable.List.foreach(List.scala:318)
  at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
  at scala.App$class.main(App.scala:71)
  at com.ibm.imstest.IMSAccessSample$.main(IMSAccessSample.scala:8)
  at com.ibm.imstest.IMSAccessSample.main(IMSAccessSample.scala)
 Caused by: scala.reflect.internal.MissingRequirementError: error while loading package, Scala signature package has wrong version
  expected: 5.0
  found: 45.0 in scala.package
  at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
  at scala.reflect.runtime.JavaMirrors$JavaMirror.handleError$1(JavaMirrors.scala:535)
  at scala.reflect.runtime.JavaMirrors$JavaMirror.unpickleClass(JavaMirrors.scala:584)
  at scala.reflect.runtime.SymbolLoaders$TopClassCompleter.complete(SymbolLoaders.scala:32)
  at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1231)
  at scala.reflect.internal.SymbolTable.openPackageModule(SymbolTable.scala:244)
  at scala.reflect.internal.SymbolTable.openPackageModule(SymbolTable.scala:300)
  at scala.reflect.runtime.SymbolLoaders$LazyPackageType.complete(SymbolLoaders.scala:89)
  at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1231)
  at scala.reflect.internal.Definitions$DefinitionsClass.AnyValClass$lzycompute(Definitions.scala:275)
  at scala.reflect.internal.Definitions$DefinitionsClass.AnyValClass(Definitions.scala:275)
  at scala.reflect.runtime.JavaMirrors$class.init(JavaMirrors.scala:50)
  at scala.reflect.runtime.JavaUniverse.init(JavaUniverse.scala:12)
  at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:26)
  at scala.reflect.runtime.package$.universe$lzycompute(package.scala:16)
  at scala.reflect.runtime.package$.universe(package.scala:16)
  at org.apache.spark.sql.types.AtomicType.<init>(DataType.scala:95)
  at org.apache.spark.sql.types.StringType.<init>(StringType.scala:33)
  at org.apache.spark.sql.types.StringType$.<init>(StringType.scala:49)
  at org.apache.spark.sql.types.StringType$.<clinit>(StringType.scala)
  ... 19 more
 
Thanks, Denis.

iulian dragos

Oct 27, 2015, 11:48:10 AM
to scala-i...@googlegroups.com
That looks like a strange version number, pointing to a corrupt or extremely old Scala library... I couldn't find that version anywhere, though, so I'm a bit puzzled. What kind of environment is that?
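
If it helps narrow it down, here's a quick way to see which scala-library the JVM on that machine actually loads at runtime (just a sketch, the object name is made up):

  object ScalaLibCheck extends App {
    // Version of the scala-library actually on the classpath, e.g. "version 2.10.4".
    println(scala.util.Properties.versionString)

    // Jar that scala.Option was loaded from; may be null if it came from the boot classpath.
    val src = Option(classOf[scala.Option[_]].getProtectionDomain.getCodeSource)
    println(src.map(_.getLocation).getOrElse("loaded from the boot classpath"))
  }

Running that on the failing system should show whether the scala-library there matches the one the program was compiled against.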




--
« I hate the mountains, they hide the view »
Alphonse Allais