java.lang.NoSuchMethodError: com.stratio.datasource.mongodb.MongodbContext.fromMongoDB(Lcom/stratio/


Rotem Gabay

Jan 23, 2017, 9:46:53 AM
to mongodb-casbah-users
Hi guys,
I can't figure out why I get this error:
[centos@i bin]$ ./spark-shell --jars ~/spark-mongodb_2.10-0.11.2.jar  --packages org.mongodb:casbah-core_2.10:3.0.0
Ivy Default Cache set to: /home/centos/.ivy2/cache
The jars for the packages stored in: /home/centos/.ivy2/jars
:: loading settings :: url = jar:file:/home/centos/spark/spark-2.0.2-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.mongodb#casbah-core_2.10 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found org.mongodb#casbah-core_2.10;3.0.0 in central
        found org.mongodb#casbah-commons_2.10;3.0.0 in central
        found com.github.nscala-time#nscala-time_2.10;1.0.0 in central
        found joda-time#joda-time;2.3 in central
        found org.joda#joda-convert;1.2 in central
        found org.mongodb#mongo-java-driver;3.0.4 in central
        found org.slf4j#slf4j-api;1.6.0 in central
        found org.mongodb#casbah-query_2.10;3.0.0 in central
:: resolution report :: resolve 377ms :: artifacts dl 13ms
        :: modules in use:
        com.github.nscala-time#nscala-time_2.10;1.0.0 from central in [default]
        joda-time#joda-time;2.3 from central in [default]
        org.joda#joda-convert;1.2 from central in [default]
        org.mongodb#casbah-commons_2.10;3.0.0 from central in [default]
        org.mongodb#casbah-core_2.10;3.0.0 from central in [default]
        org.mongodb#casbah-query_2.10;3.0.0 from central in [default]
        org.mongodb#mongo-java-driver;3.0.4 from central in [default]
        org.slf4j#slf4j-api;1.6.0 from central in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   8   |   0   |   0   |   0   ||   8   |   0   |
        ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        0 artifacts copied, 8 already retrieved (0kB/8ms)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
17/01/23 14:17:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/23 14:17:55 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://172.31.19.188:4040
Spark context available as 'sc' (master = local[*], app id = local-1485181074885).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.

scala> import org.apache.spark.sql._
import org.apache.spark.sql._

scala> import com.mongodb.casbah.{WriteConcern => MongodbWriteConcern}
import com.mongodb.casbah.{WriteConcern=>MongodbWriteConcern}

scala> import com.stratio.datasource.mongodb._
import com.stratio.datasource.mongodb._

scala> import com.stratio.datasource.mongodb.config._
import com.stratio.datasource.mongodb.config._

scala> import com.stratio.datasource.mongodb.config.MongodbConfig._
import com.stratio.datasource.mongodb.config.MongodbConfig._

scala> val builder = MongodbConfigBuilder(Map(Host -> List("localhost:27017"), Database -> "db1", Collection ->"coll1", SamplingRatio -> 0.001, WriteConcern -> "normal"))
builder: com.stratio.datasource.mongodb.config.MongodbConfigBuilder = MongodbConfigBuilder(Map(database -> db1, writeConcern -> normal, schema_samplingRatio -> 0.001, collection -> coll1, host -> List(localhost:27017)))


scala> val readConfig = builder.build()
readConfig: com.stratio.datasource.util.Config = com.stratio.datasource.util.ConfigBuilder$$anon$1@f3cee0fa
scala> val mongoRDD = spark.sqlContext.fromMongoDB(readConfig)
java.lang.NoSuchMethodError: com.stratio.datasource.mongodb.MongodbContext.fromMongoDB(Lcom/stratio/datasource/util/Config;Lscala/Option;)Lorg/apache/spark/sql/Dataset;
  ... 56 elided
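A likely cause, judging from the transcript itself: the Spark banner reports Scala 2.11.8, while both artifacts passed on the command line (`spark-mongodb_2.10-0.11.2.jar` and `casbah-core_2.10:3.0.0`) are Scala 2.10 builds, and spark-mongodb 0.11.2 was built against Spark 1.x. Mixing Scala binary versions (or Spark 1.x artifacts with Spark 2.x) typically surfaces at runtime as exactly this kind of `NoSuchMethodError`. The Scala binary version is encoded in the `_2.xx` suffix of each artifact name, so the mismatch can be spotted without unpacking anything; a quick sketch (using the file names from the command above):

```shell
# Extract the Scala binary version from each artifact name and compare it
# with the Scala version in the Spark banner (2.11.8 in this session).
for jar in spark-mongodb_2.10-0.11.2.jar casbah-core_2.10-3.0.0.jar; do
  suffix=${jar#*_}                       # drop everything up to "_"  -> "2.10-0.11.2.jar"
  echo "$jar targets Scala ${suffix%%-*}"  # drop the artifact version -> "2.10"
done
# prints:
# spark-mongodb_2.10-0.11.2.jar targets Scala 2.10
# casbah-core_2.10-3.0.0.jar targets Scala 2.10
```

If that is the problem, the fix would be to launch spark-shell with `_2.11` artifacts built for Spark 2.x instead (for example `com.stratio.datasource:spark-mongodb_2.11` and `org.mongodb:casbah-core_2.11`; the exact compatible version numbers are an assumption and worth checking against the project's release notes).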
