Does anyone know if I need some additional configuration?
Hi Eduardo,
Could you provide the full trace output from your spark-shell? Were there any errors when resolving dependencies? You should see output similar to the following:
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found org.mongodb.spark#mongo-spark-connector_2.10;1.1.0 in central
found org.mongodb#mongo-java-driver;3.2.2 in central
downloading https://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.10/1.1.0/mongo-spark-connector_2.10-1.1.0.jar ...
[SUCCESSFUL ] org.mongodb.spark#mongo-spark-connector_2.10;1.1.0!mongo-spark-connector_2.10.jar (2310ms)
downloading https://repo1.maven.org/maven2/org/mongodb/mongo-java-driver/3.2.2/mongo-java-driver-3.2.2.jar ...
[SUCCESSFUL ] org.mongodb#mongo-java-driver;3.2.2!mongo-java-driver.jar (3585ms)
:: resolution report :: resolve 14468ms :: artifacts dl 5909ms
--jars /home/rtp/bson-3.2.2.jar
Could you also try removing this bson-3.2.2.jar?
The mongo-java-driver jar (a dependency of the mongo-spark-connector) is an all-in-one jar that already includes the core driver and the BSON library. It is likely that the namespaces collide, which then causes classes to fail to resolve.
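If the duplicate BSON jar is the culprit, launching spark-shell with only the --packages coordinate should be enough, since the bundled mongo-java-driver already provides the BSON classes. A minimal sketch (the connection URI is a hypothetical placeholder; the connector version matches the resolution log above):

```shell
# Launch spark-shell with only the connector coordinate; its
# mongo-java-driver dependency bundles the BSON classes, so no
# extra --jars entry for bson is needed.
./bin/spark-shell \
  --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/test.coll" \
  --packages org.mongodb.spark:mongo-spark-connector_2.10:1.1.0
```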
Regards,
Wan