num executors in spark (standalone mode)

Debasish Kanhar

Jul 27, 2018, 7:23:03 AM
to JanusGraph users
Hi.

Is there a way to specify the number of executors I want when running an OLAP job using JanusGraph 2.1?

I've tried the following, but in every case my OLAP job takes up all the executors available:

spark.num.executors=5 [Doesn't work]
spark.executor.instances=5 [Doesn't work]
spark.extraJavaOptions=-Dnum-executors=5 [Doesn't work]

Our Spark computing environment is set up with 12 executors, each with 7 cores and 15 GB of memory.

We want to use only 5 executors, each with 6 cores and 15 GB of memory. How do I specify this?

I've been able to make the following work:

spark.executor.memory=15g
spark.executor.cores=6
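
For context, a minimal sketch of how these settings sit in my SparkGraphComputer properties file (the master URL, hostnames, and the Cassandra input format below are placeholders for illustration, not our exact values):

gremlin.graph=org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph
gremlin.hadoop.graphReader=org.janusgraph.hadoop.formats.cassandra.CassandraInputFormat
gremlin.hadoop.graphWriter=org.apache.hadoop.mapreduce.lib.output.NullOutputFormat
spark.master=spark://spark-master-host:7077
spark.executor.memory=15g
spark.executor.cores=6
spark.serializer=org.apache.spark.serializer.KryoSerializer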

How do I control the number of executors, or is that available only in YARN mode?

Thanks

Jason Plurad

Jul 27, 2018, 9:48:14 AM
to JanusGraph users
I think it's only available in YARN.

Debasish Kanhar

Aug 1, 2018, 2:41:45 AM
to JanusGraph users
Thanks Jason. Yes, you are right, it is available only in YARN mode.

I was still able to control the number of executors spawned in standalone mode using a combination of the following properties:

spark.cores.max=
spark.executor.cores=

So, according to this:

num_executors = math.floor(spark.cores.max / spark.executor.cores)

The above formula gives the number of executors, so tweaking these two parameters is how you control the executor count in standalone mode.
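
For example, to get the 5 executors with 6 cores each from my original question, the combination would look something like this (values chosen for illustration):

spark.cores.max=30
spark.executor.cores=6
spark.executor.memory=15g

num_executors = math.floor(30 / 6) = 5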

Cheers.