I'm using the MongoDB Spark Connector, and I can't insert data into an existing collection.
The save fails with this error message:
Exception in thread "main" java.lang.UnsupportedOperationException: MongoCollection already exists
at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:99)
at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:38)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:222)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
...
My insert code:
DataFrame centenarians = SparkSingletonUtils.sqlContext.sql("SELECT * FROM temp_table_name");
MongoSpark.write(centenarians).option("collection", "hundredClub").save(); // insert is not working
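Judging from the stack trace, `createRelation` seems to reject the write because the default save mode (ErrorIfExists) treats the existing collection as a conflict. A minimal sketch of an append-mode write I'm considering, assuming the standard Spark `DataFrameWriter.mode` API (untested on my side):

```java
import org.apache.spark.sql.SaveMode;
import com.mongodb.spark.MongoSpark;

// Assumption: SaveMode.Append tells the connector to insert rows
// into the existing collection instead of failing on it.
MongoSpark.write(centenarians)
        .mode(SaveMode.Append)
        .option("collection", "hundredClub")
        .save();
```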
I read that I should use a MongoWriter class, but I couldn't find it.
Does that class exist, or should I use a different method to insert new data?