Exception in thread "main" java.lang.ExceptionInInitializerError
at com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFact35)
....
...
...
..
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalStateException: Detected Guava issue #1635 which indicates that a version of Guava less than 16.01 is in use. This introduces codec resolution issues and potentially other incompatibility issues in the driver. Please upgrade to Guava 16.01 or later.
at com.datastax.driver.core.SanityChecks.checkGuava(SanityChecks.java:62)
at com.datastax.driver.core.SanityChecks.check(SanityChecks.java:36)
at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:67)
Versions -
**************
Cassandra - 2.2.4 on cluster
Spark - 1.5.2 on cluster
Kafka - kafka_2.10-0.8.2.1
Spark cassandra driver -
spark submit -
*****************
parag@diotist01:~/dev/scripts/test1$ spark-submit --class "SimpleApp" --master spark://10.88.23.13:7077 --jars lib/spark-streaming-kafka_2.10-1.5.2.jar,lib/kafka_2.10-0.8.2.1.jar,lib/kafka-clients-0.8.2.1.jar,lib/metrics-core-2.2.0.jar,lib/spark-cassandra-connector-assembly-1.5.0.jar target/scala-2.10/test1_2.10-1.0.jar
A fat jar was created for the driver.
Please guide me on how to resolve this issue.
Change your project/assembly.sbt to look like the line below so that you use version 0.14 of sbt-assembly:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.2")
Change your build.sbt to shade com.google.guava (this needs sbt-assembly 0.14), and then use a merge strategy that takes the last of each conflicting file.
Add this to the end of your build.sbt (not sure if it matters where):
// There is a conflict between Guava versions on Cassandra Drive and Cassandra
// Core
// Shading Guava Package
assemblyShadeRules in assembly := Seq(
ShadeRule.rename("com.google.**" -> "shadeio.@1").inAll
)
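To make the rename rule above concrete: the `@1` placeholder captures whatever the `**` wildcard matched, so every class under `com.google` is relocated under the `shadeio` prefix at assembly time. A minimal sketch (the Guava class name is just an illustrative example):

```scala
// Same shade rule as above, shown with its effect.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.**" -> "shadeio.@1").inAll
)
// Effect at assembly time (illustrative example class):
//   com.google.common.base.Charsets  ->  shadeio.common.base.Charsets
// .inAll rewrites both the Guava class files themselves and every
// bytecode reference to them in the other classes of the fat jar,
// so the driver's bundled Guava no longer clashes with Spark's.
```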
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
{
case PathList("META-INF", "MANIFEST.MF") => MergeStrategy.discard
case _ => MergeStrategy.last
}
}
--
You received this message because you are subscribed to the Google Groups "DataStax Spark Connector for Apache Cassandra" group.
| From: Russell Spitzer Sent: Friday, March 11, 2016 5:53 PM To: DataStax Spark Connector for Apache Cassandra Reply To: spark-conn...@lists.datastax.com Subject: Re: Exception in thread "main" java.lang.ExceptionInInitializerError at com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFact35) |
Thanks Matt. I made the suggested changes, but I am still getting the same errors.
My build.sbt
**************
name := "test1"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
"org.apache.spark" %% "spark-streaming" % "1.5.2" % "provided"
)
assemblyShadeRules in assembly := Seq(
ShadeRule.rename("com.google.**" -> "shadeio.@1").inAll
)
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
{
case PathList("META-INF", "MANIFEST.MF") => MergeStrategy.discard
case _ => MergeStrategy.last
}
}
assembly.sbt
**************
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.2")
Please guide
Parag
Hello,
I have the following dependency jars in my lib folder:
kafka_2.10-0.8.2.1.jar spark-cassandra-connector-assembly-1.5.0.jar
kafka-clients-0.8.2.1.jar spark-cassandra-connector-java-assembly-1.5.0.jar
metrics-core-2.2.0.jar
My build.sbt as follows -
name := "test1"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
"org.apache.spark" %% "spark-streaming" % "1.5.2" % "provided",
("org.apache.spark" % "spark-streaming-kafka_2.10" % "1.5.2").
exclude("org.apache.spark","spark-core_2.10")
)
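One thing worth noting about the setup above: jars dropped into `lib/` are sbt unmanaged dependencies, while anything passed to spark-submit via `--jars` bypasses sbt-assembly entirely and is never shaded. An alternative sketch is to pull the connector in as a managed dependency so it is guaranteed to go through the shade rules; the exact artifact coordinates below are an assumption based on the 1.5.0 connector version used elsewhere in this thread:

```scala
// Sketch: declare the connector as a managed dependency so sbt-assembly
// sees it and applies the com.google.** shade rule to it as well.
// The "spark-cassandra-connector" coordinates are assumed, not taken
// from the original build.sbt.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "1.5.2" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.5.2" % "provided",
  ("org.apache.spark" % "spark-streaming-kafka_2.10" % "1.5.2")
    .exclude("org.apache.spark", "spark-core_2.10"),
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0"
)
```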
assemblyShadeRules in assembly := Seq(
ShadeRule.rename("com.google.**" -> "shadeio.@1").inAll
)
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
{
case PathList("META-INF", "MANIFEST.MF") => MergeStrategy.discard
case _ => MergeStrategy.last
}
}
I run `sbt assembly` to create the jar, and then use the following spark-submit command:
spark-submit --class "SimpleApp" --master spark://10.88.23.13:7077 target/scala-2.10/test1-assembly-1.0.jar
I still get the same errors related to Guava:
Caused by: java.lang.IllegalStateException: Detected Guava issue #1635 which indicates that a version of Guava less than 16.01 is in use. This introduces codec resolution issues and potentially other incompatibility issues in the driver. Please upgrade to Guava 16.01 or later.
When I remove "provided" from build.sbt, I get:
Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes at sun.security.util.SignatureFileVerifier.processImpl(SignatureFileVerifier.java:287)
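This SecurityException is the classic signed-jar symptom: one of the bundled jars ships META-INF signature files, and `MergeStrategy.last` keeps one of them, whose digests no longer match the merged (and shaded) class files. A common fix, sketched here in the same sbt 0.13 merge-strategy style used above, is to discard signature files explicitly before the catch-all case:

```scala
// Sketch of a merge strategy that also drops jar signature files
// (*.SF, *.DSA, *.RSA), which otherwise trigger
// "Invalid signature file digest for Manifest main attributes".
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    // Discard signature files from any signed dependency jar.
    case PathList("META-INF", xs @ _*) if xs.lastOption.exists { f =>
      f.endsWith(".SF") || f.endsWith(".DSA") || f.endsWith(".RSA")
    } => MergeStrategy.discard
    case PathList("META-INF", "MANIFEST.MF") => MergeStrategy.discard
    case _ => MergeStrategy.last
  }
}
```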
****************
Please guide me on what mistake I am making.
Thanks
Parag
| From: matty...@gmail.com Sent: Wednesday, March 30, 2016 8:19 AM To: Dillon Peng Subject: Fw: Exception in thread "main" java.lang.ExceptionInInitializerError at com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFact35) |
| From: matty...@gmail.com Sent: Saturday, March 26, 2016 8:41 AM To: Dillon Peng Subject: Fw: Exception in thread "main" java.lang.ExceptionInInitializerError at com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFact35) |