Connector 2.0.0-M3 NullPointerException when creating Dataset with Spark-2.1.0


Andrei Arion

unread,
Jan 17, 2017, 11:39:12 AM1/17/17
to DataStax Spark Connector for Apache Cassandra
Hello all,
When running the attached class (TestNPE), I get an NPE when reading from Cassandra:


net.courtanet.devtools.db.spark.TestNPE
Exception in thread "main" java.lang.NullPointerException
at com.datastax.driver.core.Cluster$Manager.close(Cluster.java:1575)
at com.datastax.driver.core.Cluster$Manager.access$200(Cluster.java:1283)
at com.datastax.driver.core.Cluster.closeAsync(Cluster.java:554)
at com.datastax.driver.core.Cluster.close(Cluster.java:566)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:162)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:82)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:110)
at com.datastax.spark.connector.rdd.partitioner.dht.TokenFactory$.forSystemLocalPartitioner(TokenFactory.scala:98)
at org.apache.spark.sql.cassandra.CassandraSourceRelation$.apply(CassandraSourceRelation.scala:255)
at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:55)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
at net.courtanet.devtools.db.spark.TestNPE.main(TestNPE.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)


Here is my dependency list:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.1.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.1.0</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.11</artifactId>
  <version>2.0.0-M3</version>
</dependency>


The same code runs fine with Spark 2.0.2 and the same version of the connector.
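
For context, here is a minimal Java sketch of the kind of read that reaches DataFrameReader.load and the connector's createRelation in the trace above. This is not the attached TestNPE.java; the contact point, keyspace and table names are placeholders.

// Minimal sketch (placeholders, not the attached TestNPE.java): reading a
// Cassandra table as a Dataset<Row>; the NPE is thrown while the connector
// opens its control connection during load().
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TestNpeSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .master("local[*]")
                .appName("TestNpeSketch")
                .config("spark.cassandra.connection.host", "127.0.0.1") // placeholder contact point
                .getOrCreate();

        Dataset<Row> rows = spark.read()
                .format("org.apache.spark.sql.cassandra")
                .option("keyspace", "my_keyspace") // placeholder
                .option("table", "my_table")       // placeholder
                .load();

        rows.show();
        spark.stop();
    }
}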


Is there a workaround? Is this a known problem, or should I file a bug? (I have found no related issue in JIRA.)

Thanks,
Andrei

TestNPE.java

Russell Spitzer

unread,
Jan 17, 2017, 2:21:29 PM1/17/17
to DataStax Spark Connector for Apache Cassandra
That's a new one for me; do you have any more details? Getting an NPE inside the driver makes me think it could be a JDK version thing, but that is a really strange place to throw an error.

https://github.com/datastax/java-driver/blob/3.0.2/driver-core/src/main/java/com/datastax/driver/core/Cluster.java#L1575


Russell Spitzer
Software Engineer





Andrei Arion

unread,
Jan 18, 2017, 4:33:19 AM1/18/17
to spark-conn...@lists.datastax.com
I've tried with the latest JDK (8u121) as well as with a somewhat older version (8u111), with the same result.

Finally, I tried using the unshaded version of the connector, and the problem just went away:

<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector-unshaded_2.11</artifactId>
  <version>2.0.0-M3</version>
</dependency>

So it is probably "just a dependency problem"; I am just surprised that it manifested itself as an NPE.

I attach my whole maven dependency tree.
maven-dependency-tree.txt

Russell Spitzer

unread,
Jan 18, 2017, 11:52:52 AM1/18/17
to spark-conn...@lists.datastax.com
It could be that the Guava conflict no longer exists. Did Spark 2.1.0 upgrade Hadoop?
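
One way to check that hypothesis at runtime is to print which jar a Guava class is actually loaded from (a hypothetical debugging snippet, not part of the connector; Futures is just one convenient Guava class):

// Hypothetical diagnostic: print where Guava is loaded from, to spot a
// version conflict between Spark/Hadoop's bundled Guava and the driver's.
import java.security.CodeSource;

public class GuavaOriginCheck {
    public static void main(String[] args) {
        CodeSource src = com.google.common.util.concurrent.Futures.class
                .getProtectionDomain().getCodeSource();
        System.out.println(src == null ? "bootstrap/unknown" : src.getLocation());
    }
}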

László Szép

unread,
Mar 21, 2017, 12:15:20 PM3/21/17
to DataStax Spark Connector for Apache Cassandra
Hi,

I have this issue too. I just want to open a session:

CassandraConnector connector = CassandraConnector.apply(this.javaSparkContext.getConf());
Session session = connector.openSession();

and it throws this exception:

java.lang.NullPointerException
at com.datastax.driver.core.Cluster$Manager.close(Cluster.java:1575)
at com.datastax.driver.core.Cluster$Manager.access$200(Cluster.java:1283)
at com.datastax.driver.core.Cluster.closeAsync(Cluster.java:554)
at com.datastax.driver.core.Cluster.close(Cluster.java:566)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:167)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
at com.myproject.handlers.cassandra.CassandraHandler.createKeyspaceIfNotExists(CassandraHandler.java:70)
at com.myproject.handlers.cassandra.CassandraInitializer.checkCassandra(CassandraInitializer.java:11)
at com.myproject.handlers.cassandra.CassandraHandler.<init>(CassandraHandler.java:58)
at com.myproject.handlers.spark.SparkSessionInstance.initSession(SparkSessionInstance.java:78)
at com.myproject.handlers.spark.SparkSessionInstance.<init>(SparkSessionInstance.java:45)
at com.myproject.handlers.spark.SparkSessionInstance.<clinit>(SparkSessionInstance.java:14)
at com.myproject.restservice.Initializer.initSpark(Initializer.java:43)
at com.myproject.restservice.Initializer.init(Initializer.java:28)
at com.myproject.restservice.CTXListener.contextInitialized(CTXListener.java:20)
at org.eclipse.jetty.server.handler.ContextHandler.callContextInitialized(ContextHandler.java:843)
at org.eclipse.jetty.servlet.ServletContextHandler.callContextInitialized(ServletContextHandler.java:533)
at org.eclipse.jetty.server.handler.ContextHandler.startContext(ContextHandler.java:816)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:345)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1404)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1366)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:778)
at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:262)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:520)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.bindings.StandardStarter.processBinding(StandardStarter.java:41)
at org.eclipse.jetty.deploy.AppLifeCycle.runBindings(AppLifeCycle.java:188)
at org.eclipse.jetty.deploy.DeploymentManager.requestAppGoal(DeploymentManager.java:499)
at org.eclipse.jetty.deploy.DeploymentManager.addApp(DeploymentManager.java:147)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.fileAdded(ScanningAppProvider.java:180)
at org.eclipse.jetty.deploy.providers.WebAppProvider.fileAdded(WebAppProvider.java:452)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider$1.fileAdded(ScanningAppProvider.java:64)
at org.eclipse.jetty.util.Scanner.reportAddition(Scanner.java:610)
at org.eclipse.jetty.util.Scanner.reportDifferences(Scanner.java:529)
at org.eclipse.jetty.util.Scanner.scan(Scanner.java:392)
at org.eclipse.jetty.util.Scanner.doStart(Scanner.java:313)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.doStart(ScanningAppProvider.java:150)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.DeploymentManager.startAppProvider(DeploymentManager.java:561)
at org.eclipse.jetty.deploy.DeploymentManager.doStart(DeploymentManager.java:236)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
at org.eclipse.jetty.server.Server.start(Server.java:422)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:113)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:389)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1516)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1441)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.jetty.start.Main.invokeMain(Main.java:214)
at org.eclipse.jetty.start.Main.start(Main.java:457)
at org.eclipse.jetty.start.Main.main(Main.java:75)
java.lang.NullPointerException
at com.datastax.driver.core.Cluster$Manager.close(Cluster.java:1575)
at com.datastax.driver.core.Cluster$Manager.access$200(Cluster.java:1283)
at com.datastax.driver.core.Cluster.closeAsync(Cluster.java:554)
at com.datastax.driver.core.Cluster.close(Cluster.java:566)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:167)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
at com.myproject.handlers.cassandra.CassandraHandler.createTableIfNotExists(CassandraHandler.java:79)
at com.myproject.handlers.cassandra.CassandraInitializer.checkCassandra(CassandraInitializer.java:14)
at com.myproject.handlers.cassandra.CassandraHandler.<init>(CassandraHandler.java:58)
at com.myproject.handlers.spark.SparkSessionInstance.initSession(SparkSessionInstance.java:78)
at com.myproject.handlers.spark.SparkSessionInstance.<init>(SparkSessionInstance.java:45)
at com.myproject.handlers.spark.SparkSessionInstance.<clinit>(SparkSessionInstance.java:14)
at com.myproject.restservice.Initializer.initSpark(Initializer.java:43)
at com.myproject.restservice.Initializer.init(Initializer.java:28)
at com.myproject.restservice.CTXListener.contextInitialized(CTXListener.java:20)
at org.eclipse.jetty.server.handler.ContextHandler.callContextInitialized(ContextHandler.java:843)
at org.eclipse.jetty.servlet.ServletContextHandler.callContextInitialized(ServletContextHandler.java:533)
at org.eclipse.jetty.server.handler.ContextHandler.startContext(ContextHandler.java:816)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:345)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1404)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1366)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:778)
at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:262)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:520)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.bindings.StandardStarter.processBinding(StandardStarter.java:41)
at org.eclipse.jetty.deploy.AppLifeCycle.runBindings(AppLifeCycle.java:188)
at org.eclipse.jetty.deploy.DeploymentManager.requestAppGoal(DeploymentManager.java:499)
at org.eclipse.jetty.deploy.DeploymentManager.addApp(DeploymentManager.java:147)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.fileAdded(ScanningAppProvider.java:180)
at org.eclipse.jetty.deploy.providers.WebAppProvider.fileAdded(WebAppProvider.java:452)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider$1.fileAdded(ScanningAppProvider.java:64)
at org.eclipse.jetty.util.Scanner.reportAddition(Scanner.java:610)
at org.eclipse.jetty.util.Scanner.reportDifferences(Scanner.java:529)
at org.eclipse.jetty.util.Scanner.scan(Scanner.java:392)
at org.eclipse.jetty.util.Scanner.doStart(Scanner.java:313)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.doStart(ScanningAppProvider.java:150)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.DeploymentManager.startAppProvider(DeploymentManager.java:561)
at org.eclipse.jetty.deploy.DeploymentManager.doStart(DeploymentManager.java:236)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
at org.eclipse.jetty.server.Server.start(Server.java:422)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:113)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:389)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1516)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1441)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.jetty.start.Main.invokeMain(Main.java:214)
at org.eclipse.jetty.start.Main.start(Main.java:457)
at org.eclipse.jetty.start.Main.main(Main.java:75)

It works fine with Spark 2.0.1, but not with 2.1.0.

Moreover, it also works with Spark 2.1.0 if I run my Spark client app on Windows, but it does not work when it runs on Linux (in a Docker container).

Have you got any idea?

Russell Spitzer

unread,
Mar 21, 2017, 12:19:59 PM3/21/17
to DataStax Spark Connector for Apache Cassandra

Same issue, I think. Try doing manual shading or using the unshaded artifact, as the FAQ suggests.

László Szép

unread,
Mar 22, 2017, 4:06:22 AM3/22/17
to DataStax Spark Connector for Apache Cassandra
I tried, but it is not working.

sachin sharma

unread,
Mar 24, 2017, 5:11:04 AM3/24/17
to spark-conn...@lists.datastax.com
Hi,

I faced the same kind of issue with 1.6.3. This NullPointerException is misleading; details are as follows:

I looked at com.datastax.spark.connector.cql.CassandraConnector.scala:
-- cluster.connect() throws the following error:
   java.lang.LinkageError: loader constraint violation: loader (instance of org/eclipse/osgi/internal/baseadaptor/DefaultClassLoader) previously initiated loading for a different type with name "javax/management/MBeanServer"
-- This LinkageError is never visible, because the original exception is only wrapped and rethrown after cluster.close(), and close() itself fails with the NPE.

private def createSession(conf: CassandraConnectorConf): Session = {
  lazy val endpointsStr = conf.hosts.map(_.getHostAddress).mkString("{", ", ", "}") + ":" + conf.port
  logDebug(s"Attempting to open native connection to Cassandra at $endpointsStr")
  val cluster = conf.connectionFactory.createCluster(conf)
  try {
    val clusterName = cluster.getMetadata.getClusterName
    logInfo(s"Connected to Cassandra cluster: $clusterName")
    cluster.connect()
  }
  catch {
    case e: Throwable =>
      cluster.close()
      throw new IOException(s"Failed to open native connection to Cassandra at $endpointsStr", e)
  }
}

I was able to resolve the issue by removing javax.management from spark-assembly-1.6.3-hadoop2.6.0.jar.
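
To illustrate the masking, here is a minimal, hypothetical Java sketch (Resource, connect and close are made-up names) of the pattern in the snippet above: when close() itself throws inside the catch block, the wrapped original error is never thrown.

// Hypothetical sketch of the masking pattern: the original failure from
// connect() is lost because close() throws before the wrapper IOException
// can be constructed and thrown.
import java.io.IOException;

public class MaskingSketch {

    static class Resource {
        void connect() { throw new LinkageError("original failure"); }
        void close()   { throw new NullPointerException(); } // cleanup itself fails
    }

    public static void main(String[] args) throws IOException {
        Resource r = new Resource();
        try {
            r.connect();
        } catch (Throwable e) {
            r.close();                                      // throws NPE here...
            throw new IOException("Failed to connect", e);  // ...so this is never reached
        }
    }
}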



Russell Spitzer

unread,
Mar 24, 2017, 12:46:31 PM3/24/17
to spark-conn...@lists.datastax.com
So this is a hadoop 2.6 compatibility problem?



sachin sharma

unread,
Mar 30, 2017, 2:52:58 AM3/30/17
to spark-conn...@lists.datastax.com
This problem occurred when I was using spark-assembly as a Maven dependency from an OSGi bundle. The OSGi classloader was loading javax.management again.
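
A quick way to verify this is to check which classloader defines javax.management.MBeanServer (a hypothetical diagnostic snippet; on a healthy classpath it is the JDK's bootstrap loader):

// Hypothetical diagnostic: javax.management classes should come from the JDK's
// bootstrap classloader (getClassLoader() returns null); anything else indicates
// a second copy on the classpath, e.g. one pulled in via spark-assembly and
// loaded by the OSGi classloader.
public class MBeanServerOriginCheck {
    public static void main(String[] args) {
        ClassLoader cl = javax.management.MBeanServer.class.getClassLoader();
        System.out.println(cl == null ? "bootstrap classloader (JDK)" : cl.toString());
    }
}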




