I am new to both Spark and Cassandra.
I have worked with Cassandra from PHP, and that works well.
Now I would like to use Spark with this Cassandra cluster. I tried to connect Spark to Cassandra with the following command: bin/spark-submit --jars ~/spark-cassandra-connector-assembly-1.6.0-M2.jar --driver-class-path ~/spark-cassandra-connector-assembly-1.6.0-M2.jar teste_spark_cassandra.py
The spark-cassandra-connector-assembly-1.6.0-M2.jar was obtained at [1].
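For context, teste_spark_cassandra.py is essentially a minimal Cassandra read along these lines (the keyspace and table names below are placeholders, not the real ones):

```python
# Minimal PySpark (Spark 1.x) script that reads a Cassandra table through
# the DataStax connector. "my_keyspace" and "my_table" are placeholders.
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = (SparkConf()
        .setAppName("teste_spark_cassandra")
        .set("spark.cassandra.connection.host", "127.0.0.1"))
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)

# This read...load() is the o26.load call that fails in the traceback below.
df = (sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(keyspace="my_keyspace", table="my_table")
      .load())
df.show()
```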
I received an error message:
py4j.protocol.Py4JJavaError: An error occurred while calling o26.load.
: java.lang.NoSuchMethodError: org.apache.spark.sql.types.DecimalType.<init>(II)V
at org.apache.spark.sql.cassandra.DataTypeConverter$.<init>(DataTypeConverter.scala:29)
at org.apache.spark.sql.cassandra.DataTypeConverter$.<clinit>(DataTypeConverter.scala)
at org.apache.spark.sql.cassandra.CassandraSourceRelation$$anonfun$schema$1$$anonfun$apply$1.apply(CassandraSourceRelation.scala:61)
at org.apache.spark.sql.cassandra.CassandraSourceRelation$$anonfun$schema$1$$anonfun$apply$1.apply(CassandraSourceRelation.scala:61)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.apache.spark.sql.cassandra.CassandraSourceRelation$$anonfun$schema$1.apply(CassandraSourceRelation.scala:61)
at org.apache.spark.sql.cassandra.CassandraSourceRelation$$anonfun$schema$1.apply(CassandraSourceRelation.scala:61)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.cassandra.CassandraSourceRelation.schema(CassandraSourceRelation.scala:61)
at org.apache.spark.sql.sources.LogicalRelation.<init>(LogicalRelation.scala:30)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:120)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:259)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Thread.java:745)
[1] https://www.codementor.io/data-science/tutorial/installing-cassandra-spark-linux-debian-ubuntu-14
--
Try launching with the --packages flag instead of a manually built assembly jar:

--packages datastax:spark-cassandra-connector:1.6.0-M2-s_2.10

There is no need to specify a Cassandra driver; it is a dependency of the connector. Yes, if you use --packages or an sbt assembly, it will be included automatically.

A NoSuchMethodError on DecimalType.<init>(II)V usually indicates a Spark version mismatch: connector 1.6.0-M2 is built against Spark 1.6, where DecimalType has a (precision, scale) constructor, so make sure you are running a matching Spark 1.5/1.6 release rather than an older one.
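Putting that together, the launch command would look like this (assuming a Spark build that matches the connector, i.e. Spark 1.6 with Scala 2.10):

```shell
# Let spark-submit resolve the connector (and its Cassandra driver
# dependency) from Spark Packages instead of a hand-built assembly jar:
bin/spark-submit \
  --packages datastax:spark-cassandra-connector:1.6.0-M2-s_2.10 \
  teste_spark_cassandra.py
```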