Is there a mongo-hadoop-spark built on Scala 2.10?


Rafael Aguiar

Apr 27, 2016, 12:38:23 PM
to mongodb-user
Maybe I misunderstood something, but are all versions of mongo-hadoop-spark dependent on spark-core_2.11, and therefore on Scala 2.11?


Wan Bachtiar

May 2, 2016, 8:40:25 PM
to mongodb-user

> Maybe I understood something wrong, but are all the versions of the mongo-hadoop-spark dependent on spark-core_2.11?

Hi Rafael,
If you are referring to the mongo-hadoop connector for Spark: as mentioned in the installation section, there are only two dependencies for running the MongoDB Hadoop Connector with Spark:

  • The ‘spark’ jar, which is called mongo-hadoop-spark.jar
  • MongoDB Java Driver ‘uber’ jar called mongo-java-driver.jar

Assuming you are using the latest Apache Spark, currently v1.6.1, which is built against Scala v2.10 by default, you can add the .jar files above to the classpath. For example, when launching the Spark shell (Scala v2.10) you could run:

./spark-1.6.1-bin-hadoop2.6/bin/spark-shell --jars mongo-java-driver-3.2.2.jar,mongo-hadoop-spark-1.5.2.jar
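Once the shell starts with those jars on the classpath, reading a collection through the connector uses Hadoop's input-format API. A minimal sketch (the URI, database name, and collection name here are placeholders, and `sc` is the SparkContext the shell provides):

```scala
import org.apache.hadoop.conf.Configuration
import com.mongodb.hadoop.MongoInputFormat
import org.bson.BSONObject

// Point the connector at a collection; adjust host/db/collection to your deployment.
val config = new Configuration()
config.set("mongo.input.uri", "mongodb://localhost:27017/mydb.mycollection")

// Load the collection as an RDD of (key, document) pairs.
val mongoRDD = sc.newAPIHadoopRDD(
  config,
  classOf[MongoInputFormat],
  classOf[Object],      // key: the document _id
  classOf[BSONObject])  // value: the document itself

mongoRDD.count()
```

This keeps everything on Scala 2.10, since only the two jars above and Spark itself are involved.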

However, if you are referring to the MongoDB Scala Driver, it is currently only compatible with Scala v2.11; that said, the Scala Driver is not required for setting up MongoDB Spark integration.

Best regards,

Wan.
