Hi Community,
We are currently upgrading from Spark 2 to Spark 3, along with the matching Spark Cassandra Connector. Everything worked fine on Spark 2, but after the Spark 3 upgrade we are hitting what looks like a session leak. We write data using the Spark RDD API via CassandraJavaUtil, and since the upgrade the connector appears to open a new Cassandra connection for every batch, causing significant connection overhead.

I know DStreams with CassandraStreamingJavaUtil would solve this, but we don't have the flexibility to convert our RDD-based code to DStreams. How can we get the connector to reuse sessions across batches? Thanks in advance.
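For context, here is a minimal sketch of the kind of write we do on each batch (the keyspace, table, class, and method names are illustrative, not our actual code):

```java
import org.apache.spark.api.java.JavaRDD;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

public class BatchWriter {

    // Simple illustrative bean mapped to a Cassandra table.
    public static class MyRecord implements java.io.Serializable {
        private String id;
        private int value;
        public String getId() { return id; }
        public void setId(String id) { this.id = id; }
        public int getValue() { return value; }
        public void setValue(int value) { this.value = value; }
    }

    // Called once per batch. On Spark 2 / connector 2.x the underlying Cassandra
    // session was reused across these calls; after the Spark 3 / connector 3.x
    // upgrade we observe a new connection being opened on every call.
    public static void writeBatch(JavaRDD<MyRecord> batch) {
        javaFunctions(batch)
            .writerBuilder("my_keyspace", "my_table", mapToRow(MyRecord.class))
            .saveToCassandra();
    }
}
```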
Regards,
Surya Goli.