setting SPARK_JAVA_OPTS in spark-env.sh not working as expected

Hussam Jarada

Oct 23, 2013, 3:47:10 PM
to spark...@googlegroups.com
I am using Spark 0.8.0 and have this line in my spark-env.sh:

SPARK_JAVA_OPTS+=" -Dspark.local.dir=/tmp/spark  -XX:+UseParallelGC -XX:+UseParallelOldGC -XX:+DisableExplicitGC -Xms1024m -Xmx2048m -XX:MaxPermSize=256m"
export SPARK_JAVA_OPTS

The logs show these options being used, but -Xms512m -Xmx512m also appears. How can I remove any reference to -Xms512m -Xmx512m when I am explicitly setting -Xms1024m -Xmx2048m in my spark-env.sh?

Spark Command: /opt/jdk1.7.0_40/bin/java -cp :/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar -Dspark.local.dir=/tmp/spark -XX:+UseParallelGC -XX:+UseParallelOldGC -XX:+DisableExplicitGC -Xms1024m -Xmx2048m -XX:MaxPermSize=256m -Dspark.local.dir=/tmp/spark -XX:+UseParallelGC -XX:+UseParallelOldGC -XX:+DisableExplicitGC -Xms1024m -Xmx2048m -XX:MaxPermSize=256m -Dspark.local.dir=/tmp/spark -XX:+UseParallelGC -XX:+UseParallelOldGC -XX:+DisableExplicitGC -Xms1024m -Xmx2048m -XX:MaxPermSize=256m -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip poc1.stc1lab.local --port 7077 --webui-port 8080

========================================

If I instead set this in my spark-env.sh:
export SPARK_JAVA_OPTS=" -Dspark.local.dir=/tmp/spark  -XX:+UseParallelGC -XX:+UseParallelOldGC -XX:+DisableExplicitGC -Xms1024m -Xmx2048m -XX:MaxPermSize=256m"

The logs show:

Spark Command: /opt/jdk1.7.0_40/bin/java -cp :/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip poc1.stc1lab.local --port 7077 --webui-port 8080
========================================

Can someone please advise on the correct way to set SPARK_JAVA_OPTS in spark-env.sh?

Thanks,

Jey Kottalam

Oct 23, 2013, 6:51:14 PM
to spark...@googlegroups.com
Hi Hussam,

Perhaps you want to set the "spark.executor.memory" system
property[1], or the SPARK_DAEMON_MEMORY environment variable[2]?
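For example, a minimal sketch of what that could look like in
conf/spark-env.sh on a 0.8.0 standalone cluster (the 2g values are
placeholder assumptions, not recommendations):

# SPARK_DAEMON_MEMORY sets the heap of the standalone master/worker
# daemons themselves; the launch scripts otherwise fall back to a 512m
# default, which is likely where your -Xms512m -Xmx512m comes from.
export SPARK_DAEMON_MEMORY=2g

# spark.executor.memory sets the per-executor heap for applications; as
# a system property it can be passed via -D. This assumes your driver is
# launched through a script that sources spark-env.sh.
export SPARK_JAVA_OPTS="-Dspark.executor.memory=2g"

A plain assignment rather than += should also avoid the options showing
up three times in your first log, which looks like the result of
spark-env.sh being sourced more than once during launch.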

References:
1. https://spark.incubator.apache.org/docs/latest/configuration.html#system-properties
2. https://spark.incubator.apache.org/docs/latest/spark-standalone.html#cluster-launch-scripts

-Jey