java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset

kant kodali

Oct 5, 2016, 4:16:49 PM
to spark-conn...@lists.datastax.com
Exception in thread "dag-scheduler-event-loop" java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset
 at java.lang.Class.getDeclaredMethods0(Native Method)
 at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
 at java.lang.Class.getDeclaredMethod(Class.java:2128)
 at java.io.ObjectStreamClass.getPrivateMethod(ObjectStreamClass.java:1475)


Spark Version: 2.0.0
scala version: 2.11.8
Java: 8
compile group: 'com.datastax.spark', name: 'spark-cassandra-connector_2.11', version: '2.0.0-M3'

My client program is written in Java. I am assuming this is a version issue; any idea?

Thanks,

kant

kant kodali

Oct 7, 2016, 2:09:29 AM
to russell...@gmail.com, spark-conn...@lists.datastax.com


---------- Forwarded message ----------
From: kant kodali <kant...@gmail.com>
Date: Wed, Oct 5, 2016 at 1:16 PM
Subject: java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset
To: spark-conn...@lists.datastax.com


Hi Russell,

Any idea on this one? I did place this jar under spark_home/jars, although I didn't understand why I needed to do that. I also included it in my project, but I still get the same exception as below.

Thanks!

Russell Spitzer

Oct 7, 2016, 2:12:16 AM
to kant kodali, spark-conn...@lists.datastax.com
Are you not using spark-submit? It seems like the Spark jars are simply not on the classpath, assuming you fixed all the other things previously mentioned.

kant kodali

Oct 7, 2016, 2:23:56 AM
to Russell Spitzer, spark-conn...@lists.datastax.com
Hi Russell,

I never submitted a job through the command line. I use Spark standalone mode, and when I was not yet using the Spark Cassandra Connector I just invoked the following code, which I assumed would submit the job, because it worked fine. I am trying to do something similar to the code below, except this time I want to use the Spark Cassandra Connector.




SparkConf sparkConf = config.buildSparkConfig();
sparkConf.setJars(JavaSparkContext.jarOfClass(SparkDriver.class));
JavaStreamingContext ssc = new JavaStreamingContext(sparkConf, new Duration(config.getSparkStremingBatchInterval()));
ssc.sparkContext().setLogLevel("ERROR");
Receiver receiver = new Receiver(config);
JavaReceiverInputDStream<String> jsonMessagesDStream = ssc.receiverStream(receiver);
jsonMessagesDStream.count().print(); // an output operation (e.g. print) is required before start()
ssc.start();
ssc.awaitTermination();

Russell Spitzer

Oct 7, 2016, 2:27:11 AM
to spark-conn...@lists.datastax.com, Russell Spitzer

Use spark submit



kant kodali

Oct 7, 2016, 2:42:02 AM
to spark-conn...@lists.datastax.com, Russell Spitzer
Sure, I can try that, but is it not possible to do it through the API like the code I had before?

Russell Spitzer

Oct 7, 2016, 2:47:47 AM
to spark-conn...@lists.datastax.com, Russell Spitzer

http://spark.apache.org/docs/latest/submitting-applications.html

Spark submit is really the only supported way to launch a distributed Spark app. While you can mimic what it does through the API, it is most likely that a detail will be missed; for example, you have a core Spark library not present on the classpath. Without knowing exactly how you launch this, it would be impossible for me to guess why that is the case. If you use spark-submit, I know exactly how the dependencies are distributed and how the environment and classpath are propagated.
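A minimal spark-submit invocation along these lines might look as follows. This is a sketch only: the main class package, master URL, and jar path are placeholders, and only the connector coordinates come from the messages above.

```shell
# Sketch: com.example.SparkDriver, the master URL, and the jar path are
# placeholders, not details taken from this thread.
$SPARK_HOME/bin/spark-submit \
  --class com.example.SparkDriver \
  --master spark://<master-host>:7077 \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3 \
  path/to/my-app.jar
```

With --packages, spark-submit resolves the connector and its transitive dependencies and ships them to the executors, so nothing has to be copied into spark_home/jars by hand.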

kant kodali

Oct 7, 2016, 3:20:07 AM
to spark-conn...@lists.datastax.com, Russell Spitzer
I would definitely take the approach you are suggesting, but due to some time constraints, and given my current knowledge of Spark, I am trying to take some shortcuts. Is the output below useful at all? All I did to generate it was run:

gradle dependencies | grep "spark"
+--- org.apache.spark:spark-core_2.11:2.0.0
| +--- org.apache.spark:spark-launcher_2.11:2.0.0
| | +--- org.apache.spark:spark-tags_2.11:2.0.0
| | | \--- org.spark-project.spark:unused:1.0.0
| | \--- org.spark-project.spark:unused:1.0.0
| +--- org.apache.spark:spark-network-common_2.11:2.0.0
| | +--- org.apache.spark:spark-tags_2.11:2.0.0 (*)
| | \--- org.spark-project.spark:unused:1.0.0
| +--- org.apache.spark:spark-network-shuffle_2.11:2.0.0
| | +--- org.apache.spark:spark-network-common_2.11:2.0.0 (*)
| | +--- org.apache.spark:spark-tags_2.11:2.0.0 (*)
| | \--- org.spark-project.spark:unused:1.0.0
| +--- org.apache.spark:spark-unsafe_2.11:2.0.0
| | +--- org.apache.spark:spark-tags_2.11:2.0.0 (*)
| | \--- org.spark-project.spark:unused:1.0.0
| +--- org.apache.spark:spark-tags_2.11:2.0.0 (*)
| \--- org.spark-project.spark:unused:1.0.0
+--- org.apache.spark:spark-streaming_2.11:2.0.0
| +--- org.apache.spark:spark-core_2.11:2.0.0 (*)
| +--- org.apache.spark:spark-tags_2.11:2.0.0 (*)
| \--- org.spark-project.spark:unused:1.0.0
+--- com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3

kant kodali

Oct 7, 2016, 3:45:33 AM
to spark-conn...@lists.datastax.com, Russell Spitzer
Looks like there is a much shorter version. I am just trying to print the Java classpath; here is what I get after grepping for Spark jars.

gradle clean build | grep "spark"
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-core_2.11/2.0.0/c4d04336c142f10eb7e172155f022f86b6d11dd3/spark-core_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-streaming_2.11/2.0.0/7227cbd39f5952b0ed3579bc78463bcc318ecd2b/spark-streaming_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/com.datastax.spark/spark-cassandra-connector_2.11/2.0.0-M3/d38ac36dde076e3364f1024985754bce84bd39d/spark-cassandra-connector_2.11-2.0.0-M3.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-launcher_2.11/2.0.0/9c3e1bd84ccb099e86ea232f5acd8fec1a61e291/spark-launcher_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-network-common_2.11/2.0.0/b451dae899ee8138e96319528eed64f7e849dbe2/spark-network-common_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-network-shuffle_2.11/2.0.0/233c036e88761424212508b2a6a55633a3cf4ec8/spark-network-shuffle_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-unsafe_2.11/2.0.0/9f8682d4c83ce32f08fea067c2e22aaabca27d86/spark-unsafe_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-tags_2.11/2.0.0/7f84a46b1e60c1981e47cae05c462fed65217eff/spark-tags_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.spark-project.spark/unused/1.0.0/205fe37a2fade6ce6dfcf8eff57ed21a4a1c22af/unused-1.0.0.jar

kant kodali

Oct 7, 2016, 3:51:55 AM
to spark-conn...@lists.datastax.com, Russell Spitzer

/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-core_2.11/2.0.0/c4d04336c142f10eb7e172155f022f86b6d11dd3/spark-core_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-streaming_2.11/2.0.0/7227cbd39f5952b0ed3579bc78463bcc318ecd2b/spark-streaming_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/com.datastax.spark/spark-cassandra-connector_2.11/2.0.0-M3/d38ac36dde076e3364f1024985754bce84bd39d/spark-cassandra-connector_2.11-2.0.0-M3.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-launcher_2.11/2.0.0/9c3e1bd84ccb099e86ea232f5acd8fec1a61e291/spark-launcher_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-network-common_2.11/2.0.0/b451dae899ee8138e96319528eed64f7e849dbe2/spark-network-common_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-network-shuffle_2.11/2.0.0/233c036e88761424212508b2a6a55633a3cf4ec8/spark-network-shuffle_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-unsafe_2.11/2.0.0/9f8682d4c83ce32f08fea067c2e22aaabca27d86/spark-unsafe_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-tags_2.11/2.0.0/7f84a46b1e60c1981e47cae05c462fed65217eff/spark-tags_2.11-2.0.0.jar
/Users/kantkodali/.gradle/caches/modules-2/files-2.1/org.spark-project.spark/unused/1.0.0/205fe37a2fade6ce6dfcf8eff57ed21a4a1c22af/unused-1.0.0.jar

kant kodali

Oct 7, 2016, 4:07:28 AM
to spark-conn...@lists.datastax.com, Russell Spitzer
Hi! I realized my previous two emails didn't come out right (the text got mixed up), so I am pasting the list here:

org.apache.spark/spark-core_2.11/2.0.0/c4d04336c142f10eb7e172155f022f86b6d11dd3/spark-core_2.11-2.0.0.jar

org.apache.spark/spark-streaming_2.11/2.0.0/7227cbd39f5952b0ed3579bc78463bcc318ecd2b/spark-streaming_2.11-2.0.0.jar

com.datastax.spark/spark-cassandra-connector_2.11/2.0.0-M3/d38ac36dde076e3364f1024985754bce84bd39d/spark-cassandra-connector_2.11-2.0.0-M3.jar

org.apache.spark/spark-launcher_2.11/2.0.0/9c3e1bd84ccb099e86ea232f5acd8fec1a61e291/spark-launcher_2.11-2.0.0.jar

org.apache.spark/spark-network-common_2.11/2.0.0/b451dae899ee8138e96319528eed64f7e849dbe2/spark-network-common_2.11-2.0.0.jar

org.apache.spark/spark-network-shuffle_2.11/2.0.0/233c036e88761424212508b2a6a55633a3cf4ec8/spark-network-shuffle_2.11-2.0.0.jar

org.apache.spark/spark-unsafe_2.11/2.0.0/9f8682d4c83ce32f08fea067c2e22aaabca27d86/spark-unsafe_2.11-2.0.0.jar

org.apache.spark/spark-tags_2.11/2.0.0/7f84a46b1e60c1981e47cae05c462fed65217eff/spark-tags_2.11-2.0.0.jar

org.spark-project.spark/unused/1.0.0/205fe37a2fade6ce6dfcf8eff57ed21a4a1c22af/unused-1.0.0.jar  

kant kodali

unread,
Oct 7, 2016, 5:04:30 AM10/7/16
to spark-conn...@lists.datastax.com, Russell Spitzer
Hi Russell,

Thanks for the advice. Someone in the Spark user group helped me: I was missing spark-sql. I didn't realize it is needed for the spark-cassandra-connector.
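For anyone hitting the same error: the fix described above amounts to declaring spark-sql explicitly in the build. A Gradle sketch, with the version chosen to match the Spark 2.0.0 used throughout this thread:

```groovy
// Sketch of the missing dependency: spark-sql must be on the classpath because
// the connector references org.apache.spark.sql.Dataset and friends.
compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.0.0'
```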

Thanks!