Issue with Guava


vincent gromakowski

Feb 19, 2016, 9:21:08 AM
to spark-conn...@lists.datastax.com
Hi all,
I still get the same error when using the spark-cassandra-connector in Zeppelin. I put the Guava 19.0 jar on the classpath, but half the time I start the Spark interpreter I get this error. The jar shows up in the classpath entries in the Spark UI, but I suppose it sometimes picks up the Guava library bundled with Spark, which is 14.0.1.


This is the error I get (half of the time only). If I restart the interpreter, it's OK:
java.lang.ExceptionInInitializerError
	at com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFactory.scala:35)
	at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:87)
	at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:153)
	at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
	at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
	at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
	at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
	at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
	at com.datastax.spark.connector.rdd.CassandraTableScanRDD.compute(CassandraTableScanRDD.scala:218)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Detected Guava issue #1635 which indicates that a version of Guava less than 16.01 is in use.  This introduces codec resolution issues and potentially other incompatibility issues in the driver.  Please upgrade to Guava 16.01 or later.
	at com.datastax.driver.core.SanityChecks.checkGuava(SanityChecks.java:62)
	at com.datastax.driver.core.SanityChecks.check(SanityChecks.java:36)
	at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:67)
	... 17 more

Is the solution to build a fat jar with sbt assembly, which would bring in Guava 16.0.1? If so, I get invalid signature errors when building the master branch...
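
An aside with one possible workaround for this kind of classpath-precedence conflict, as a hedged sketch: Spark 1.x has experimental userClassPathFirst options that make the driver and executors load user-supplied jars (such as the Guava 19.0 jar above) ahead of Spark's own classpath, which bundles Guava 14.0.1. In Zeppelin these would normally be set as interpreter properties; the Scala snippet below is only an illustration.

    import org.apache.spark.SparkConf

    // Hedged sketch: both keys are real (experimental) Spark 1.x options.
    // Setting them via SparkConf is illustrative; in Zeppelin they would
    // usually go into the Spark interpreter's properties instead.
    val conf = new SparkConf()
      .set("spark.driver.userClassPathFirst", "true")   // driver side
      .set("spark.executor.userClassPathFirst", "true") // executor side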

Eric Meisel

Feb 19, 2016, 9:22:35 AM
to spark-conn...@lists.datastax.com
Can you share the build.sbt file you're using for the assembly?


vincent gromakowski

Feb 19, 2016, 9:24:15 AM
to spark-conn...@lists.datastax.com
From the master branch of the git repo:
sbt -Dscala-2.11=true assembly

Eric Meisel

Feb 19, 2016, 9:27:16 AM
to spark-conn...@lists.datastax.com
Oh, I was referring to your specific project. You can create a build.sbt file and pull in a Maven release for your dependencies. Here's an example of a project that does this:

https://github.com/databricks/learning-spark/blob/master/build.sbt 

I'll try building from master too and see if I get errors.
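
A minimal build.sbt sketch along those lines, assuming a standalone project that pulls the connector from Maven rather than building the connector repo itself. The project name and Spark version are assumptions; the connector coordinates match the 1.5.0-RC1 jar mentioned later in this thread.

    // Hedged sketch: a standalone project depending on a released connector.
    name := "zeppelin-cassandra-example"  // hypothetical project name

    scalaVersion := "2.10.5"              // assumed; use 2.11.x for a 2.11 build

    libraryDependencies ++= Seq(
      "org.apache.spark"   %% "spark-core"                % "1.6.0" % "provided", // assumed Spark version
      "com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0-RC1"
    )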

vincent gromakowski

Feb 19, 2016, 9:28:36 AM
to spark-conn...@lists.datastax.com
I am using Zeppelin with manual jar loading. No sbt.

Eric Meisel

Feb 19, 2016, 9:30:35 AM
to spark-conn...@lists.datastax.com
Ah, sorry, you said that and I missed it. 

vincent gromakowski

Feb 19, 2016, 9:41:16 AM
to spark-conn...@lists.datastax.com
This is what I load from Zeppelin:
....../spark-cassandra-connector_2.10-1.5.0-RC1.jar
....../com.google.guava.guava-19.0.jar
....../cassandra-driver-core-3.0.0-rc1.jar

Eric Meisel

Feb 19, 2016, 9:45:50 AM
to spark-conn...@lists.datastax.com
My sbt/sbt -Dscala_2.11=true assembly command worked locally. 

Can you share the errors you got during the assembly?

It also looks like you're assembling a Scala 2.11 version from master, but referencing Scala 2.10 in your Zeppelin load.

vincent gromakowski

Feb 19, 2016, 9:49:25 AM
to spark-conn...@lists.datastax.com
No, sorry, I have tried to build two versions, one for Scala 2.10 and one for 2.11.

vincent gromakowski

Feb 19, 2016, 9:50:35 AM
to spark-conn...@lists.datastax.com
This is the error I get when starting Spark with the assembly build:

Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
        at sun.security.util.SignatureFileVerifier.processImpl(SignatureFileVerifier.java:286)
        at sun.security.util.SignatureFileVerifier.process(SignatureFileVerifier.java:239)
        at java.util.jar.JarVerifier.processEntry(JarVerifier.java:274)
        at java.util.jar.JarVerifier.update(JarVerifier.java:228)
        at java.util.jar.JarFile.initializeVerifier(JarFile.java:348)
        at java.util.jar.JarFile.getInputStream(JarFile.java:415)
        at sun.misc.JarIndex.getJarIndex(JarIndex.java:137)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:674)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:666)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:665)
        at sun.misc.URLClassPath$JarLoader.<init>(URLClassPath.java:638)
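
For reference, this "Invalid signature file digest" failure is the classic symptom of a fat jar that still carries the META-INF signature files of the signed jars it bundles; once repacked, those signatures no longer match and the JVM rejects the manifest. A hedged sketch of the usual sbt-assembly fix (0.14.x "in assembly" syntax assumed), discarding the signature files during the merge:

    // Hedged sketch for build.sbt: drop signature files from bundled signed jars
    // so the unsigned fat jar passes JVM verification.
    assemblyMergeStrategy in assembly := {
      case PathList("META-INF", name) if Seq(".SF", ".DSA", ".RSA").exists(ext => name.endsWith(ext)) =>
        MergeStrategy.discard
      case x =>
        // Fall back to the plugin's default strategy for everything else.
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
    }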