Hi Dennis,
Thank you for your reply.
I didn't change any properties on Dataproc.
I launched a new cluster, ran the spark-shell command from the master node, and verified: I see an executor also running on the master node, along with the driver.
I see the master assigned to yarn; please find the output below. Can you guide me on how to set things up so that the master node holds only the driver program?
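For reference, this is roughly how I launched the shell (I passed no extra flags, so these are the defaults I assume were in effect):

```shell
# Run on the Dataproc master node; spark-shell defaults to
# --deploy-mode client, so the driver runs on the master itself,
# while YARN schedules executors on any NodeManager, including
# the master if it runs a NodeManager.
spark-shell --master yarn --deploy-mode client
```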
Spark context available as 'sc' (master = yarn, app id = application_1583671393942_0002).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.4
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_242)
Type in expressions to have them evaluated.
Type :help for more information.
scala>