I have to use the Cassandra Java driver directly at the same time, so I chose the unshaded version of the connector. After pinning the Cassandra Java driver to 3.6.2, I tried several shaded versions of the connector but couldn't find a compatible one.
When running with spark-cassandra-connector_2.11:2.5.1, it throws this exception:
Caused by: com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2017-02-01T08:00:00Z of type class java.time.Instant to java.sql.Date.
while in the source code I did define it to be java.sql.Date: private val buildDate = Date.valueOf("2017-02-01")
unshaded 2.4.3 is the only version I could use so far, but as said, I couldn't make it work with the newer Guava. I also tried shaded 2.4.3, which throws:
java.lang.NoSuchMethodError: com.datastax.driver.core.TypeCodec.<init>(Lcom/datastax/driver/core/DataType;Lcom/google/common/reflect/TypeToken;)V
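For anyone reproducing this, the working combination I describe above corresponds roughly to the following sbt setup (a sketch; the `-unshaded` artifact name is my reading of what Maven Central lists for the 2.4.x line, so verify it against your build):

```scala
// build.sbt (sketch): unshaded connector 2.4.3 plus the driver pinned to 3.6.2.
// The NoSuchMethodError above appears when the shaded connector (which expects
// a relocated Guava TypeToken) is mixed with unshaded driver/Guava classes.
libraryDependencies ++= Seq(
  "com.datastax.spark" %% "spark-cassandra-connector-unshaded" % "2.4.3",
  "com.datastax.cassandra" % "cassandra-driver-core" % "3.6.2"
)
```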
org.apache.spark.SparkException: Job aborted due to stage failure: Task 11 in stage 0.0 failed 1 times, most recent failure: Lost task 11.0 in stage 0.0 (TID 11, localhost, executor driver): com.datastax.spark.connector.types.TypeConversionException: Failed to convert column build_start_time of geocache.access_points_v3 to java.sql.Date: 2017-02-01T08:00:00Z
at com.datastax.spark.connector.mapper.GettableDataToMappedTypeConverter.tryConvert(GettableDataToMappedTypeConverter.scala:134)
at com.datastax.spark.connector.mapper.GettableDataToMappedTypeConverter.convertedColumnValue(GettableDataToMappedTypeConverter.scala:160)
at com.datastax.spark.connector.mapper.GettableDataToMappedTypeConverter.com$datastax$spark$connector$mapper$GettableDataToMappedTypeConverter$$ctorParamValue(GettableDataToMappedTypeConverter.scala:191)
at com.datastax.spark.connector.mapper.GettableDataToMappedTypeConverter$$anonfun$com$datastax$spark$connector$mapper$GettableDataToMappedTypeConverter$$fillBuffer$1.apply$mcVI$sp(GettableDataToMappedTypeConverter.scala:223)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at com.datastax.spark.connector.mapper.GettableDataToMappedTypeConverter.com$datastax$spark$connector$mapper$GettableDataToMappedTypeConverter$$fillBuffer(GettableDataToMappedTypeConverter.scala:222)
at com.datastax.spark.connector.mapper.GettableDataToMappedTypeConverter$$anonfun$convertPF$1.applyOrElse(GettableDataToMappedTypeConverter.scala:260)
at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:44)
at com.datastax.spark.connector.mapper.GettableDataToMappedTypeConverter.convert(GettableDataToMappedTypeConverter.scala:19)
at com.datastax.spark.connector.rdd.reader.ClassBasedRowReader.read(ClassBasedRowReader.scala:33)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$15.apply(CassandraTableScanRDD.scala:347)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$15.apply(CassandraTableScanRDD.scala:347)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:445)
at com.datastax.spark.connector.util.CountingIterator.next(CountingIterator.scala:16)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:463)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:125)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2017-02-01T08:00:00Z of type class java.time.Instant to java.sql.Date.
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:46)
at com.datastax.spark.connector.types.TypeConverter$SqlDateConverter$$anonfun$convertPF$14.applyOrElse(TypeConverter.scala:323)
at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:44)
at com.datastax.spark.connector.types.TypeConverter$SqlDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:320)
at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:57)
at com.datastax.spark.connector.types.TypeConverter$SqlDateConverter$.convert(TypeConverter.scala:320)
at com.datastax.spark.connector.mapper.GettableDataToMappedTypeConverter.tryConvert(GettableDataToMappedTypeConverter.scala:131)
... 26 more
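The trace shows the underlying mismatch: connector 2.5.x sits on the 4.x Java driver, which hands timestamp columns back as java.time.Instant, and the connector's SqlDateConverter path cannot bridge that to java.sql.Date. One workaround that avoids the converter entirely (a sketch, assuming you control the mapped case class; `AccessPoint` and `toSqlDate` are hypothetical names) is to declare the field as java.time.Instant and convert explicitly after reading:

```scala
import java.sql.Date
import java.time.{Instant, ZoneId}

// Hypothetical mapped row: declare the timestamp column as Instant
// (what the 4.x driver returns) instead of java.sql.Date.
case class AccessPoint(buildStartTime: Instant)

// Convert explicitly; the zone matters because java.sql.Date is day-only.
def toSqlDate(instant: Instant, zone: ZoneId = ZoneId.of("UTC")): Date =
  Date.valueOf(instant.atZone(zone).toLocalDate)

val row = AccessPoint(Instant.parse("2017-02-01T08:00:00Z"))
val buildDate = toSqlDate(row.buildStartTime) // 2017-02-01 in UTC
```

This keeps the connector's own conversion on a type it supports and pushes the Instant-to-Date decision (including the time zone) into your code.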