Scala signature package has wrong version expected: 5.0 found: 45.0 in scala.package

dg
Oct 27, 2015, 6:58:13 AM
to scala-internals
Hi,
 
I have created a simple Scala/Spark program that queries a database using SQL and exported it to a jar file.
When I run the program locally on Windows 7 with different JVMs (1.7, 1.8), it works fine.
However, on another platform I get the following error, and I do not know whether I should open a problem report with the JVM provider, or look into Scala or Spark.
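As a quick sanity check, it can help to print which Scala library the JVM actually loads on each platform. A minimal sketch (the probe class is arbitrary, and getCodeSource can return null for bootstrap-loaded classes):

object VersionCheck extends App {
  // Version string of the scala-library on the runtime classpath
  println(scala.util.Properties.versionString)
  // Location of the jar that scala.Option was loaded from
  // (may be null if loaded by the bootstrap class loader)
  println(classOf[Option[_]].getProtectionDomain.getCodeSource.getLocation)
}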
 
The stack trace is as follows:
 java.lang.ExceptionInInitializerError
  at java.lang.J9VMInternals.ensureError(J9VMInternals.java:134)
  at java.lang.J9VMInternals.recordInitializationFailure(J9VMInternals.java:123)
  at org.apache.spark.sql.jdbc.JDBCRDD$.org$apache$spark$sql$jdbc$JDBCRDD$$getCatalystType(JDBCRDD.scala:62)
  at org.apache.spark.sql.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:137)
  at org.apache.spark.sql.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:137)
  at scala.Option.getOrElse(Option.scala:120)
  at org.apache.spark.sql.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:136)
  at org.apache.spark.sql.jdbc.JDBCRelation.<init>(JDBCRelation.scala:128)
  at org.apache.spark.sql.jdbc.DefaultSource.createRelation(JDBCRelation.scala:113)
  at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:269)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
  at com.ibm.imstest.IMSAccessSample$delayedInit$body.apply(IMSAccessSample.scala:26)
  at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
  at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
  at scala.App$$anonfun$main$1.apply(App.scala:71)
  at scala.App$$anonfun$main$1.apply(App.scala:71)
  at scala.collection.immutable.List.foreach(List.scala:318)
  at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
  at scala.App$class.main(App.scala:71)
  at com.ibm.imstest.IMSAccessSample$.main(IMSAccessSample.scala:8)
  at com.ibm.imstest.IMSAccessSample.main(IMSAccessSample.scala)
 Caused by: scala.reflect.internal.MissingRequirementError: error while loading package, Scala signature package has wrong version
  expected: 5.0
  found: 45.0 in scala.package
  at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
  at scala.reflect.runtime.JavaMirrors$JavaMirror.handleError$1(JavaMirrors.scala:535)
  at scala.reflect.runtime.JavaMirrors$JavaMirror.unpickleClass(JavaMirrors.scala:584)
  at scala.reflect.runtime.SymbolLoaders$TopClassCompleter.complete(SymbolLoaders.scala:32)
  at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1231)
  at scala.reflect.internal.SymbolTable.openPackageModule(SymbolTable.scala:244)
  at scala.reflect.internal.SymbolTable.openPackageModule(SymbolTable.scala:300)
  at scala.reflect.runtime.SymbolLoaders$LazyPackageType.complete(SymbolLoaders.scala:89)
  at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1231)
  at scala.reflect.internal.Definitions$DefinitionsClass.AnyValClass$lzycompute(Definitions.scala:275)
  at scala.reflect.internal.Definitions$DefinitionsClass.AnyValClass(Definitions.scala:275)
  at scala.reflect.runtime.JavaMirrors$class.init(JavaMirrors.scala:50)
  at scala.reflect.runtime.JavaUniverse.init(JavaUniverse.scala:12)
  at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:26)
  at scala.reflect.runtime.package$.universe$lzycompute(package.scala:16)
  at scala.reflect.runtime.package$.universe(package.scala:16)
  at org.apache.spark.sql.types.AtomicType.<init>(DataType.scala:95)
  at org.apache.spark.sql.types.StringType.<init>(StringType.scala:33)
  at org.apache.spark.sql.types.StringType$.<init>(StringType.scala:49)
  at org.apache.spark.sql.types.StringType$.<clinit>(StringType.scala)
  ... 19 more
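Judging from the "Caused by" section, the failure happens while Scala's runtime-reflection universe is being initialized (Spark's StringType triggers it via AtomicType). A minimal sketch that exercises only that path, with no Spark involved, could show whether plain reflection already fails on the affected platform:

object ReflectInit extends App {
  // Forces initialization of the runtime reflection universe,
  // the same step that fails in the stack trace above
  val universe = scala.reflect.runtime.universe
  println("reflection universe initialized: " + universe.getClass.getName)
}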
Thanks for any ideas on how to pursue/fix this error.
 
The error occurs on the first load:
package com.ibm.imstest

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object IMSAccessSample extends App {
  val conf = new SparkConf()
    .setMaster("local[1]")
    .setAppName("GetStokStat")
    .set("spark.executor.memory", "1g")

  val sc = new SparkContext(conf)
  val sqlContext = new SQLContext(sc)

  // JDBC options for the IMS data source
  val optionsStokStat = scala.collection.mutable.Map[String, String]()
  optionsStokStat.put("driver", "com.ibm.ims.jdbc.IMSDriver")
  optionsStokStat.put("url", "jdbc:ims://10.1.1.1:5559:user=user;password=password;")
  optionsStokStat.put("dbtable", "STOKSTAT")

  // This is the call where the ExceptionInInitializerError is thrown
  val stokStat = sqlContext.read.format("jdbc").options(optionsStokStat).load()
}
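For completeness, once load() succeeds the DataFrame would typically be used like this (a hypothetical follow-up, not part of the failing program):

// Hypothetical next steps after a successful load
stokStat.printSchema() // schema resolved from STOKSTAT via JDBC
stokStat.show(10)      // print the first rows
sc.stop()              // shut down the SparkContext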
 
Thanks, Denis.