Of course, deleting records from Cassandra is discouraged because it adds overhead (tombstones), and you should minimize deletions whenever possible.
Now that that's out of the way: is there a way to delete records using the Spark Cassandra connector, or do you need to use the Java driver and execute DELETE statements directly?
--
You received this message because you are subscribed to the Google Groups "DataStax Spark Connector for Apache Cassandra" group.
To unsubscribe from this group and stop receiving emails from it, send an email to spark-connector-...@lists.datastax.com.
Seconded. I usually have to loop within an RDD and make raw connector calls.
Why don't we have a function that writes tombstones? My application needs to delete each item in an RDD from C*. If I only delete the non-primary-key columns, I have to do a lot of filtering when I load the data back. Wouldn't it be nicer to have a function that deletes the whole row?
Yes on my end.
That would be nice and flexible.
CassandraJavaUtil.javaFunctions(theRDD)
    .writerBuilder(keyspace, table, writer_factory)
    .withConstantTTL(1)
    .saveToCassandra();
I am trying the option of deleting each row directly by looping through an RDD in Java. The code snippet is below:
transient static CassandraConnector connector = null;
transient static Session session;
transient static PreparedStatement pstmt;
transient static BoundStatement delBoundStmt;

private static void deleteData(JavaStreamingContext jssc, JavaDStream<DataEvent> dataEvents) {
    connector = CassandraConnector.apply(jssc.sparkContext().getConf());
    dataEvents.map(new Function<DataEvent, Void>() {
        @Override
        public Void call(DataEvent dataEvent) throws Exception {
            session = connector.openSession();
            pstmt = session.prepare("DELETE FROM temp.DataEvent where id = '?' and seq = ? and tid = ? and dtm = ?");
            delBoundStmt = new BoundStatement(pstmt).bind(dataEvent.getId(), dataEvent.getSeq(), dataEvent.getTid(), dataEvent.getDtm());
            session.execute(delBoundStmt);
            session.close();
            return null;
        }
    });
}
But records are not getting deleted.
Can you please help with this?
Thanks
Vinayak
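Two things stand out in the snippet above: `map()` is a lazy transformation, so without an output operation the delete code never actually runs, and the first bind marker is quoted (`'?'` is a string literal in CQL, not a bind marker). A sketch of a likely fix, using `foreachRDD`/`foreachPartition` with one session per partition instead of one per record (keyspace, table, and `DataEvent` getters taken from the snippet above; untested against a live cluster):

```java
import com.datastax.driver.core.PreparedStatement;
import com.datastax.driver.core.Session;
import com.datastax.spark.connector.cql.CassandraConnector;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

private static void deleteData(JavaStreamingContext jssc, JavaDStream<DataEvent> dataEvents) {
    // CassandraConnector is serializable, so it can be captured by the closures below.
    final CassandraConnector connector = CassandraConnector.apply(jssc.sparkContext().getConf());
    // foreachRDD is an output operation, so Spark actually executes this work
    // (map() alone is lazy and is dropped if nothing consumes its result).
    dataEvents.foreachRDD(rdd -> rdd.foreachPartition(events -> {
        // One session per partition, not one per record; try-with-resources closes it.
        try (Session session = connector.openSession()) {
            // Note: bind markers must not be quoted ('?' would be a string literal).
            PreparedStatement pstmt = session.prepare(
                "DELETE FROM temp.DataEvent WHERE id = ? AND seq = ? AND tid = ? AND dtm = ?");
            while (events.hasNext()) {
                DataEvent e = events.next();
                session.execute(pstmt.bind(e.getId(), e.getSeq(), e.getTid(), e.getDtm()));
            }
        }
    }));
}
```

Preparing the statement once per partition (rather than once per record) also avoids re-prepare warnings from the driver.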