Maybe I've misunderstood something, but are all versions of mongo-hadoop-spark dependent on spark-core_2.11?
Hi Rafael,
If you are referring to mongo-hadoop connector for Spark, as mentioned on the installation section there are only two dependencies for running MongoDB Hadoop Connector with Spark:
mongo-hadoop-spark.jar
mongo-java-driver.jar
Assuming you are using the latest Apache Spark, currently v1.6.1, which is built against Scala v2.10 by default, you can add the .jar files above to the classpath. For example, using the Spark shell (Scala v2.10) you could specify:
./spark-1.6.1-bin-hadoop2.6/bin/spark-shell --jars mongo-java-driver-3.2.2.jar,mongo-hadoop-spark-1.5.2.jar
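Once the shell starts with those jars, the connector is used through Hadoop's InputFormat API. The sketch below, which assumes a MongoDB deployment running on localhost with a test.myCollection collection (both hypothetical names), shows how you could read documents into an RDD from the Scala shell:

```scala
import org.apache.hadoop.conf.Configuration
import org.bson.BSONObject
import com.mongodb.hadoop.MongoInputFormat

// Point the connector at the source collection (adjust the URI
// for your own deployment; this one is just an example).
val mongoConfig = new Configuration()
mongoConfig.set("mongo.input.uri", "mongodb://localhost:27017/test.myCollection")

// Each record arrives as an (ObjectId, BSONObject) pair.
val documents = sc.newAPIHadoopRDD(
  mongoConfig,
  classOf[MongoInputFormat],  // InputFormat from mongo-hadoop-spark.jar
  classOf[Object],            // key: the document _id
  classOf[BSONObject])        // value: the full document

// Inspect a field from the first few documents.
documents.take(5).foreach { case (id, doc) => println(doc.get("name")) }
```

Note that this runs inside spark-shell, where the SparkContext sc is already defined, and it requires both jars on the classpath plus a reachable mongod; it is a sketch, not a complete application.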
However, if you are referring to the MongoDB Scala Driver, it is currently only compatible with Scala v2.11. That said, the Scala Driver is not required for setting up MongoDB Spark integration.
Best regards,
Wan.