How to use spylon-kernel (scala) in Jupyter Notebook of dataproc to connect to Hive metastore?


Desiree Zhu

Feb 10, 2020, 12:26:15 PM
to Google Cloud Dataproc Discussions
I installed Jupyter Notebook on my Dataproc cluster, but how can I make it connect to the Hive metastore on another Dataproc cluster? I have put hive-site.xml in /usr/lib/spark/conf, so spark-sql can connect there, but that does not work in Jupyter Notebook; it seems that `spark.sql("show databases")` connects to the local metastore instead of the remote Hive one. Thanks!

karth...@google.com

Feb 10, 2020, 12:39:54 PM
to Google Cloud Dataproc Discussions
Looking in /etc/spark/conf/spark-env.sh, Spark is already configured to read Hive configuration through /etc/hive/conf/*: `SPARK_DIST_CLASSPATH="$SPARK_DIST_CLASSPATH:/etc/hive/conf"`

You can configure Hive properties in Dataproc through --properties: https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/cluster-properties.

So I suspect something like `gcloud dataproc clusters create --optional-components=ANACONDA,JUPYTER --properties hive:some.hive.property=some.value,hive:another.hive.property=another.value` will work.
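Filled out with a cluster name and region, that command might look like the sketch below. The cluster name, region, bucket, and host here are placeholders, not values from this thread:

```shell
# Create a Dataproc cluster with Jupyter enabled and Hive pointed at a
# remote metastore. CLUSTER-NAME, the region, and the property values
# are placeholders -- substitute your own.
gcloud dataproc clusters create my-cluster \
  --region=us-central1 \
  --optional-components=ANACONDA,JUPYTER \
  --properties="hive:hive.metastore.uris=thrift://metastore-host:9083,hive:hive.metastore.warehouse.dir=gs://my-warehouse-bucket/hive-warehouse"
```

The `hive:` prefix tells Dataproc to write each key into hive-site.xml on the cluster, so every component that reads Hive configuration picks it up.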

Desiree Zhu

Feb 10, 2020, 1:26:45 PM
to Google Cloud Dataproc Discussions
Thank you so much! I put the hive-site.xml under /etc/hive/conf. Now the Jupyter Notebook 'PySpark' kernel can connect to Hive, but 'spylon-kernel' cannot; there must be some configuration for that kernel, but I could not find it.
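One spylon-kernel-specific knob worth trying: the kernel's `%%init_spark` magic (from the spylon-kernel README) lets you set Spark properties from a notebook cell before the Scala interpreter starts. A sketch, with the metastore host as a placeholder:

```
%%init_spark
# This cell runs before any Scala code; launcher.conf values become
# properties of the Spark session that spylon-kernel creates.
launcher.conf.spark.sql.catalogImplementation = 'hive'
# spark.hadoop.* keys are passed through to the Hadoop/Hive configuration.
launcher.conf.spark.hadoop.hive.metastore.uris = 'thrift://<remote-hive-host>:9083'
```

Whether this is needed depends on whether spylon-kernel's launcher picks up /etc/spark/conf the way spark-submit does; if it does not, setting the properties explicitly here sidesteps that.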

hg...@google.com

Mar 4, 2020, 5:34:10 PM
to Google Cloud Dataproc Discussions
As far as I was able to test the scenarios here, spylon-kernel connects to the remote Hive metastore and performs reads and writes successfully once Spark SQL is configured with spark.sql.catalogImplementation=hive, by setting the following properties when creating the cluster:

--properties hive:hive.metastore.warehouse.dir=gs://<warehouse-path>,hive:hive.metastore.uris=thrift://<remote-hive-host>:9083,spark:spark.sql.catalogImplementation=hive
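Once the cluster is up, a quick sanity check from a spylon-kernel (Scala) notebook cell might look like this; it assumes the `spark` session that spylon-kernel provides, running against a cluster created with the properties above:

```scala
// Should print "hive" rather than "in-memory" if the catalog took effect.
println(spark.conf.get("spark.sql.catalogImplementation"))

// Databases listed here should come from the remote metastore,
// not a cluster-local Derby one.
spark.sql("SHOW DATABASES").show()
```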
