Change a mongo-spark connection configuration from a Databricks Python notebook

Matteo Moci

Jun 24, 2016, 10:57:42 PM6/24/16
to mongod...@googlegroups.com
Hello Everyone, 
I just posted this question on Stack Overflow [0]; I'm re-posting it here in case anyone can help.

I have managed to connect to MongoDB from Spark, using the mongo-spark connector [1], from a Databricks notebook in Python.

Right now I configure the MongoDB URI in an environment variable, but that is not flexible: I want to be able to change the connection parameters directly in my notebook.
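
For reference, this is roughly the kind of setup I mean (a rough sketch; the environment variable name MONGODB_URI is just a placeholder for illustration):

import os
from pyspark import SparkConf

# The URI comes from an environment variable, so it is fixed when the
# cluster starts and cannot be changed from within the notebook.
# MONGODB_URI is a placeholder name, not my actual variable.
mongodb_uri = os.environ["MONGODB_URI"]

conf = SparkConf() \
    .set("spark.mongodb.input.uri", mongodb_uri) \
    .set("spark.mongodb.output.uri", mongodb_uri)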

I read in the connector documentation [2] that it is possible to override any values set in the SparkConf.

How can I override those values from Python?

Best, 
Matteo


--
Matteo Moci
http://mox.fm

Ross Lawley

Jun 26, 2016, 11:28:37 AM6/26/16
to mongodb-user
Hi,

I posted a full answer on Stack Overflow, but the short answer is that you can supply any configuration option to the DataFrame reader or writer using the option() method.
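
For example, something along these lines should work (a minimal sketch; the URI, database, and collection values are placeholders):

# Read: options set on the reader override anything in the SparkConf
df = sqlContext.read \
    .format("com.mongodb.spark.sql.DefaultSource") \
    .option("uri", "mongodb://host:27017/") \
    .option("database", "mydb") \
    .option("collection", "mycollection") \
    .load()

# Write: the writer accepts per-operation options in the same way
df.write \
    .format("com.mongodb.spark.sql.DefaultSource") \
    .mode("append") \
    .option("uri", "mongodb://host:27017/") \
    .option("database", "mydb") \
    .option("collection", "othercollection") \
    .save()

The same works for any of the connector's configuration settings, so the connection details no longer have to live in the cluster-wide SparkConf.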

All the best,

Ross
