Please check a few things:
- Do you also have spark.cassandra.output.batch.size.rows set in your configuration? You can check with sc.getConf.get("spark.cassandra.output.batch.size.rows"). If it is set, remove it (see the sketch after this list).
- This problem can also occur if a single row is larger than 8 KB. What are you saving? Is it possible that a single row exceeds 8 KB?
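A minimal sketch of that check, assuming a plain Scala Spark application (the app name is a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}

    // Configure byte-based batching only; leave batch.size.rows unset.
    val conf = new SparkConf()
      .setAppName("batch-size-check")                          // placeholder name
      .set("spark.cassandra.output.batch.size.bytes", "8192")  // 8 KB limit

    val sc = new SparkContext(conf)

    // Verify which batch-size settings are actually in effect.
    println(sc.getConf.getOption("spark.cassandra.output.batch.size.rows"))  // expect None
    println(sc.getConf.get("spark.cassandra.output.batch.size.bytes"))       // expect 8192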
Hi Jacek,
I am not setting spark.cassandra.output.batch.size.rows; I am only setting spark.cassandra.output.batch.size.bytes to 8 KB. The property shows up correctly in the Environment tab of the Spark UI, but Spark does not seem to honor this limit when batching writes, which is why I am seeing the "Batch too large" error and my app dies.
Also, a single row of mine is only 300-400 bytes, not beyond that.
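(One workaround worth trying, sketched below under the assumption of the connector's standard write API; the keyspace, table, and column names are placeholders: pass the byte limit explicitly per save call via WriteConf, so it does not depend on the global setting being picked up.)

    import org.apache.spark.{SparkConf, SparkContext}
    import com.datastax.spark.connector._
    import com.datastax.spark.connector.writer.{WriteConf, BytesInBatch}

    val sc = new SparkContext(new SparkConf().setAppName("save-with-byte-limit"))
    val rows = sc.parallelize(Seq((1, "value1"), (2, "value2")))

    // Force the 8 KB byte limit for this save, bypassing the global config.
    rows.saveToCassandra(
      "my_keyspace", "my_table",
      SomeColumns("id", "value"),
      writeConf = WriteConf(batchSize = BytesInBatch(8192)))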
Thanks
Hema
--
You received this message because you are subscribed to the Google Groups "DataStax Spark Connector for Apache Cassandra" group.
To unsubscribe from this group and stop receiving emails from it, send an email to spark-connector-...@lists.datastax.com.
JACEK LEWANDOWSKI
SOFTWARE ENGINEER | jlewan...@datastax.com