Unresolved dependencies: com.datastax.spark#spark-cassandra-connector_2.11;2.0.1-s_2.11: not found

Mario Grgic

Jun 9, 2017, 7:40:58 PM
to DataStax Spark Connector for Apache Cassandra
I'm following the instructions for using the Spark Cassandra connector, and no matter what I do I always get the unresolved dependency errors below. I am on Mac OS X 10.12.5 with Spark 2.1.1, Scala 2.11.8, and JDK 1.8.0_51.

$ pyspark --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.1-s_2.11
Python 2.7.10 (default, Feb 7 2017, 00:08:15)
Type "copyright", "credits" or "license" for more information.

IPython 5.4.1 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
Ivy Default Cache set to: ~/.ivy2/cache
The jars for the packages stored in: ~/.ivy2/jars
:: loading settings :: url = jar:file:/Volumes/DATA/dev/SMACK/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.datastax.spark#spark-cassandra-connector_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
:: resolution report :: resolve 945ms :: artifacts dl 0ms
:: modules in use:
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 1 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
module not found: com.datastax.spark#spark-cassandra-connector_2.11;2.0.1-s_2.11

==== local-m2-cache: tried

file:~/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.11/2.0.1-s_2.11/spark-cassandra-connector_2.11-2.0.1-s_2.11.pom

-- artifact com.datastax.spark#spark-cassandra-connector_2.11;2.0.1-s_2.11!spark-cassandra-connector_2.11.jar:

file:~/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.11/2.0.1-s_2.11/spark-cassandra-connector_2.11-2.0.1-s_2.11.jar

==== local-ivy-cache: tried

~/.ivy2/local/com.datastax.spark/spark-cassandra-connector_2.11/2.0.1-s_2.11/ivys/ivy.xml

-- artifact com.datastax.spark#spark-cassandra-connector_2.11;2.0.1-s_2.11!spark-cassandra-connector_2.11.jar:

~/.ivy2/local/com.datastax.spark/spark-cassandra-connector_2.11/2.0.1-s_2.11/jars/spark-cassandra-connector_2.11.jar

==== central: tried

https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.11/2.0.1-s_2.11/spark-cassandra-connector_2.11-2.0.1-s_2.11.pom

-- artifact com.datastax.spark#spark-cassandra-connector_2.11;2.0.1-s_2.11!spark-cassandra-connector_2.11.jar:

https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.11/2.0.1-s_2.11/spark-cassandra-connector_2.11-2.0.1-s_2.11.jar

==== spark-packages: tried

http://dl.bintray.com/spark-packages/maven/com/datastax/spark/spark-cassandra-connector_2.11/2.0.1-s_2.11/spark-cassandra-connector_2.11-2.0.1-s_2.11.pom

-- artifact com.datastax.spark#spark-cassandra-connector_2.11;2.0.1-s_2.11!spark-cassandra-connector_2.11.jar:

http://dl.bintray.com/spark-packages/maven/com/datastax/spark/spark-cassandra-connector_2.11/2.0.1-s_2.11/spark-cassandra-connector_2.11-2.0.1-s_2.11.jar

::::::::::::::::::::::::::::::::::::::::::::::

:: UNRESOLVED DEPENDENCIES ::

::::::::::::::::::::::::::::::::::::::::::::::

:: com.datastax.spark#spark-cassandra-connector_2.11;2.0.1-s_2.11: not found

::::::::::::::::::::::::::::::::::::::::::::::

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.datastax.spark#spark-cassandra-connector_2.11;2.0.1-s_2.11: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1083)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:296)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:160)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[TerminalIPythonApp] WARNING | Unknown error in handling PYTHONSTARTUP file /Volumes/DATA/dev/SMACK//spark/python/pyspark/shell.py:
---------------------------------------------------------------------------
Exception Traceback (most recent call last)
/Library/Python/2.7/site-packages/IPython/core/shellapp.pyc in _exec_file(self, fname, shell_futures)
326 self.shell.user_ns,
327 shell_futures=shell_futures,
--> 328 raise_exceptions=True)
329 finally:
330 sys.argv = save_argv

/Library/Python/2.7/site-packages/IPython/core/interactiveshell.pyc in safe_execfile(self, fname, *where, **kw)
2481 py3compat.execfile(
2482 fname, glob, loc,
-> 2483 self.compile if kw['shell_futures'] else None)
2484 except SystemExit as status:
2485 # If the call was made with 0 or None exit status (sys.exit(0)

/Library/Python/2.7/site-packages/IPython/utils/py3compat.pyc in execfile(fname, glob, loc, compiler)
287 where = [ns for ns in [glob, loc] if ns is not None]
288 if compiler is None:
--> 289 builtin_mod.execfile(filename, *where)
290 else:
291 scripttext = builtin_mod.open(fname).read().rstrip() + '\n'

/Volumes/DATA/dev/SMACK/spark/python/pyspark/shell.py in <module>()
36 SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
37
---> 38 SparkContext._ensure_initialized()
39
40 try:

/Volumes/DATA/dev/SMACK/spark/python/pyspark/context.pyc in _ensure_initialized(cls, instance, gateway, conf)
257 with SparkContext._lock:
258 if not SparkContext._gateway:
--> 259 SparkContext._gateway = gateway or launch_gateway(conf)
260 SparkContext._jvm = SparkContext._gateway.jvm
261

/Volumes/DATA/dev/SMACK/spark/python/pyspark/java_gateway.pyc in launch_gateway(conf)
93 callback_socket.close()
94 if gateway_port is None:
---> 95 raise Exception("Java gateway process exited before sending the driver its port number")
96
97 # In Windows, ensure the Java child processes do not linger after Python has exited.

Exception: Java gateway process exited before sending the driver its port number

The only way I can sort of get this going is to compile the connector from source and then pass the jar via the --jars option.

However, in that case I get missing-class errors when the Spark job is submitted.
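
Roughly what I did (a sketch; the build command and assembly jar path are approximate and depend on the local checkout):

# build the connector assembly from source
git clone https://github.com/datastax/spark-cassandra-connector.git
cd spark-cassandra-connector
sbt assembly

# pass the jar directly; unlike --packages, --jars does not fetch
# transitive dependencies, which is presumably where the missing classes come from
pyspark --jars spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-2.0.1.jar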

What am I doing wrong?

Russell Spitzer

Jun 9, 2017, 7:52:58 PM
to DataStax Spark Connector for Apache Cassandra
You are using the wrong identifier.

Please follow the instructions here:
https://spark-packages.org/package/datastax/spark-cassandra-connector

Or, if you want to use the Maven artifact:
https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10/2.0.2

But the string you passed to --packages mixes the two, taking some elements from one coordinate scheme and some from the other, so it ends up pointing at a non-existent location on Maven.
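
For example, either of these forms should resolve (a sketch, assuming Spark 2.1 with Scala 2.11; 2.0.1 matches the version you were trying):

# spark-packages coordinate: org "datastax", with the -s_2.11 suffix in the version
pyspark --packages datastax:spark-cassandra-connector:2.0.1-s_2.11

# Maven coordinate: group "com.datastax.spark", Scala suffix on the artifact name, plain version
pyspark --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.1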

Russell Spitzer
Software Engineer

Mario Grgic

Jun 9, 2017, 8:07:40 PM
to DataStax Spark Connector for Apache Cassandra
Thank you. I was following the instructions from the Cassandra connector GitHub page:

https://github.com/datastax/spark-cassandra-connector/blob/master/doc/15_python.md

In any case, starting pyspark (or spark-shell) with:

spark-shell --packages mysql:mysql-connector-java:5.1.38,datastax:spark-cassandra-connector:2.0.1-s_2.11 --conf spark.cassandra.connection.host=localhost

works; however, I still get the following error:

:: problems summary ::
:::: ERRORS
unknown resolver null

even though the packages are loaded and everything I have tried so far works.
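
For example, a simple read through the DataFrame API comes back fine (keyspace and table names here are just my local test data):

# read a Cassandra table via the connector's DataFrame source
df = spark.read \
    .format("org.apache.spark.sql.cassandra") \
    .options(table="kv", keyspace="test") \
    .load()
df.show()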
