spark on kubernetes


R Rao

Sep 14, 2018, 11:27:36 AM
to Kubernetes user discussion and Q&A
Hi guys,
I'm trying to figure out how to run a Spark job that talks to my HBase cluster.
I don't want to bake/hardcode the HBase config into the driver or executor images; I want the configuration to be available via a ConfigMap.
Can anybody please help? I'm new to this.
Thanks



Yinan Li

Sep 15, 2018, 1:56:17 AM
to Kubernetes user discussion and Q&A
Spark on Kubernetes doesn't yet support mounting ConfigMaps. I'm not very familiar with how HBase is configured. Does it use the Hadoop configuration system? If so, you can set Hadoop config options through Spark configuration properties with the prefix "spark.hadoop.*"; Spark strips that prefix and passes the remaining options to the Hadoop configuration. If HBase doesn't use the Hadoop configuration system, you can use the Spark Operator instead, which supports mounting ConfigMaps through a mutating admission webhook. See the documentation at https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/user-guide.md#mounting-configmaps.
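
To make the first option concrete, here is a minimal PySpark sketch of the "spark.hadoop.*" approach. The property names and hosts (hbase.zookeeper.quorum, zk-1.example.com, the client port) are placeholder assumptions; substitute whatever your hbase-site.xml actually contains, and note that whether your HBase connector reads Spark's Hadoop configuration depends on the connector.

from pyspark.sql import SparkSession

# Any property prefixed with "spark.hadoop." is copied by Spark into the
# Hadoop Configuration it builds (with the prefix stripped), so code that
# reads that Configuration will see e.g. "hbase.zookeeper.quorum".
spark = (
    SparkSession.builder
        .appName("hbase-job")
        # Hypothetical HBase settings; use the values from your hbase-site.xml.
        .config("spark.hadoop.hbase.zookeeper.quorum", "zk-1.example.com,zk-2.example.com")
        .config("spark.hadoop.hbase.zookeeper.property.clientPort", "2181")
        .getOrCreate()
)

The same properties can also be passed at submission time, e.g. spark-submit --conf spark.hadoop.hbase.zookeeper.quorum=..., which keeps them out of both the image and the application code.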