Hi,
I am trying to use Spark's newAPIHadoopRDD function to read data from a Cassandra database, similar to the example in CassandraTest.scala.
The code is the following:
JavaPairRDD<ByteBuffer, SortedMap<ByteBuffer, IColumn>> casRdd = sc.newAPIHadoopRDD(
        job.getConfiguration(),
        ColumnFamilyInputFormat.class,
        ByteBuffer.class,
        Class.forName("java.util.SortedMap"));
However, I get the compile error: "Type mismatch: cannot convert from JavaPairRDD<ByteBuffer,capture#2-of ?> to JavaPairRDD<ByteBuffer,SortedMap<ByteBuffer,IColumn>>".
I have also tried SortedMap<ByteBuffer,IColumn>.class, but the compiler rejects that too (class literals cannot be parameterized).
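To make sure I understand the problem, I put together a minimal, self-contained sketch with no Spark or Cassandra dependencies (describe below is just a hypothetical stand-in for newAPIHadoopRDD's class-token parameters, and Object stands in for IColumn). It shows that Class.forName returns Class<?>, which is where the "capture#2-of ?" comes from, and that an unchecked double cast on the class token does compile, though I am not sure it is the right approach for newAPIHadoopRDD:

```java
import java.nio.ByteBuffer;
import java.util.SortedMap;

public class ClassTokenDemo {
    // Hypothetical stand-in for newAPIHadoopRDD's (kClass, vClass) parameters:
    // a generic method whose type parameters are inferred from the class tokens.
    static <K, V> String describe(Class<K> kClass, Class<V> vClass) {
        return kClass.getSimpleName() + " -> " + vClass.getSimpleName();
    }

    @SuppressWarnings("unchecked")
    public static void main(String[] args) throws Exception {
        // Class.forName returns Class<?>, so V is inferred as a wildcard
        // capture -- the source of the "capture#2-of ?" mismatch.
        Class<?> raw = Class.forName("java.util.SortedMap");

        // Generics are erased at runtime, so SortedMap.class is the correct
        // runtime token for SortedMap<ByteBuffer, Object>; the unchecked cast
        // only silences the compiler, it does nothing at runtime.
        Class<SortedMap<ByteBuffer, Object>> vClass =
                (Class<SortedMap<ByteBuffer, Object>>) raw;

        System.out.println(describe(ByteBuffer.class, vClass));
    }
}
```

Running it prints "ByteBuffer -> SortedMap", so the cast compiles and behaves as expected in isolation.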
Could someone help with this?
Thanks!