Mongo Hadoop: upgrading from version 1.5.2 to 2.0.2


David Song

Jul 11, 2017, 5:49:53 AM
to mongodb-user
Dear All,
I use Mongo Hadoop to write data from Spark directly to MongoDB. In version 1.5.2 I used it like this (setObj is a BasicDBObject):
(null, new MongoUpdateWritable(
        new BasicDBObject("_id", "12345"),  // Query
        new BasicDBObject("$set", setObj),  // Update operation
        //setObj,  // Update operation
        true,   // Upsert
        false   // Update multiple documents
))

After updating to version 2.0.2:
(null, new MongoUpdateWritable(
        new BasicDBObject("_id", "12345"),  // Query
        new BasicDBObject("$set", setObj),  // Update operation
        //setObj,  // Update operation
        true,   // Upsert
        false,  // Update multiple documents
        false   // Replace
))

the following exception occurs:
java.lang.IllegalArgumentException: Invalid BSON field name $set

Doesn't it support the $set operation? And another question: I found that a field name can't contain ".", which is how embedded documents are referenced.

thanks

Wan Bachtiar

Aug 10, 2017, 9:30:43 PM
to mongodb-user

java.lang.IllegalArgumentException: Invalid BSON field name $set Doesn’t it support set operation?

Hi David,

It’s been a while since you posted your question; have you found the answer yet?

Yes, it still supports the $set operation. Check the content of your setObj variable.
For example, the following works with mongo-hadoop v2.0.2, Spark v2.0, and MongoDB Java driver v3.4.x:

MongoUpdateWritable test = new MongoUpdateWritable(new BasicDBObject("_id", 123), 
                            new BasicDBObject("$set", new BasicDBObject("foo", "bar")), 
                            true, false, false
                            );

And another question I found that the field name can’t contain “.”, which is inherited document use.
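On the dotted-key question: field names stored in BSON documents can't contain "." because the dot is reserved for the path notation that addresses embedded documents (e.g. "address.city" in a query). The usual workaround is to store a genuinely nested sub-document instead of a dotted key. A minimal plain-Java sketch of that idea, using java.util maps in place of BasicDBObject and a hypothetical put helper (the class and method names are illustrative, not part of mongo-hadoop):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class NestDotted {
    // Hypothetical helper: convert a dotted key like "address.city" into
    // nested maps, so no stored field name contains a '.'.
    @SuppressWarnings("unchecked")
    public static void put(Map<String, Object> root, String dottedKey, Object value) {
        String[] parts = dottedKey.split("\\.");
        Map<String, Object> current = root;
        // Walk (or create) one nested map per path segment except the last.
        for (int i = 0; i < parts.length - 1; i++) {
            current = (Map<String, Object>) current.computeIfAbsent(
                    parts[i], k -> new LinkedHashMap<String, Object>());
        }
        // The final segment holds the actual value.
        current.put(parts[parts.length - 1], value);
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new LinkedHashMap<>();
        put(doc, "address.city", "Sydney");
        put(doc, "address.zip", "2000");
        System.out.println(doc); // nested structure, no dotted keys
    }
}
```

The same shape translates directly to BasicDBObject: build a nested BasicDBObject per path segment rather than using a key that contains a dot.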

I use Mongo Hadoop to store data in Spark directly.

If you’re utilising mongo-hadoop to read from or write to MongoDB from Spark, I would recommend checking out the MongoDB Spark Connector (Java) instead. See also the Spark Java Guide: Write to MongoDB.
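For reference, a minimal sketch of a write with the MongoDB Spark Connector (assuming mongo-spark-connector 2.x on the classpath; the URI, database, and collection names are placeholders to replace with your own). Note that by default MongoSpark.save replaces a document with a matching _id, which covers the upsert-by-_id case from the original post:

```java
import com.mongodb.spark.MongoSpark;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.bson.Document;

import java.util.Arrays;

public final class WriteToMongo {
    public static void main(String[] args) {
        // Placeholder output URI: database "test", collection "myCollection".
        SparkConf conf = new SparkConf()
                .setAppName("WriteToMongo")
                .set("spark.mongodb.output.uri",
                     "mongodb://127.0.0.1/test.myCollection");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // Build the documents to write as an RDD of org.bson.Document.
        JavaRDD<Document> docs = jsc.parallelize(Arrays.asList(
                new Document("_id", "12345").append("foo", "bar")));

        // Writes the RDD to the collection configured above.
        MongoSpark.save(docs);

        jsc.stop();
    }
}
```

This replaces the hand-built MongoUpdateWritable tuples entirely; finer-grained write behaviour can be tuned through the connector's WriteConfig options.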

Regards,
Wan.
