Hi Everyone,
I am trying to use rdd.deleteFromCassandra and am getting the error below. I have the datastax:spark-cassandra-connector:2.4.2-s_2.12 library installed on my Databricks cluster:
command-3863946755109700:12: error: object deleteFromCassandra is not a member of package com.datastax.spark.connector.rdd
rdd.deleteFromCassandra(
^
command-3863946755109700:14: error: not found: value writeConf
writeConf = WriteConf(hupExDf.count > 0))
Below is the code I am trying to use (basically, I am trying to delete from Cassandra the rows that are saved in a Delta table):
import com.datastax.spark.connector._
import com.datastax.spark.connector.writer._
val hupExDf = sqlContext.sql(s"select * from $auditExDbName")
if (hupExDf.count > 0) {
  val hupExSchema = Seq("msd_code", "effective_date", "sys_creation_date", "sys_update_date", "bpr_tier", "company_name", "expiration_date")
  hupExDf.toDF(hupExSchema: _*)
  // sc.parallelize(Seq(("animal", "trex"), ("animal", "mammoth")))
  // rdd.deleteFromCassandra(cassKeyspace, "hup_corp_msd_reference")
  rdd.deleteFromCassandra(
    cassKeyspace, "hup_corp_msd_reference",
    writeConf = WriteConf(hupExDf.count > 0))
  display(hupExDf)
}
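For reference, my understanding of the first error is that no val named rdd is defined in my code, so the compiler resolves rdd to the package com.datastax.spark.connector.rdd (and the second error about writeConf follows from that failed call). Below is a minimal sketch of how I believe deleteFromCassandra is meant to be invoked: on an actual RDD of primary-key values, not on a DataFrame. This is an untested sketch for a Databricks/Cassandra environment; keysRdd is a name I made up, and I am assuming msd_code is the partition key of hup_corp_msd_reference.

import com.datastax.spark.connector._          // adds deleteFromCassandra to RDDs via implicit RDDFunctions
import com.datastax.spark.connector.writer._

// Build an RDD of single-column keys (Tuple1) from the DataFrame's key column.
// Assumption: "msd_code" is the partition key of the target table.
val keysRdd = hupExDf.select("msd_code").rdd.map(r => Tuple1(r.getString(0)))

// Delete the matching rows, telling the connector which key column the tuples map to.
keysRdd.deleteFromCassandra(
  cassKeyspace, "hup_corp_msd_reference",
  keyColumns = SomeColumns("msd_code"))

Note also that in my original snippet writeConf = WriteConf(hupExDf.count > 0) passes a Boolean where WriteConf expects its configuration settings, so even with a proper RDD in scope that argument would not compile as written.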