Installation error on windows while executing sparkling-shell --conf "spark.executor.memory=1g"


sunilchi...@gmail.com

Jun 21, 2016, 11:17:45 AM6/21/16
to H2O Open Source Scalable Machine Learning - h2ostream, Sunilkumar....@fmr.com
I am trying to install Sparkling Water on Windows 10. I followed the instructions, installed Spark 1.6.1, and set the SPARK_HOME system environment variable to the Spark directory. Similarly, I have created a MASTER environment variable and set it to local-cluster[3,2,1024].

When I run the command sparkling-shell --conf "spark.executor.memory=1g" from the Sparkling Water bin directory, it fails with this error:

You are trying to use Sparkling Water built for Spark , but your %SPARK_HOME(=C:\spark140) property points to Spark of version 1.4.0. Please ensure correct Spark is provided and re-run Sparkling Water.

I tried installing multiple versions of Spark and pointed the SPARK_HOME environment variable to 1.5 and 1.6 as well, but ended up getting the same error.

Is there something wrong in what I am doing? Are there any special instructions for installing Sparkling Water on Windows?

Finally, is setting the system environment variables on Windows the equivalent of these steps?

export SPARK_HOME="/path/to/spark/installation"

# To launch a local Spark cluster with 3 worker nodes with 2 cores and 1g per node.

export MASTER="local-cluster[3,2,1024]"
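In other words, I assume the Windows command-prompt equivalents of those exports would be something like the following (the Spark path is a placeholder for my actual install directory):

```shell
:: Windows cmd equivalents of the export commands above (session-only;
:: I set them permanently via System Properties instead)
set SPARK_HOME=C:\path\to\spark\installation
:: Local Spark cluster with 3 worker nodes, 2 cores and 1024 MB per node
set MASTER=local-cluster[3,2,1024]
```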

Thanks for your help and time.

mat...@0xdata.com

Jun 22, 2016, 4:04:42 AM6/22/16
to H2O Open Source Scalable Machine Learning - h2ostream, Sunilkumar....@fmr.com, sunilchi...@gmail.com
Hey,

There are no additional steps for Windows. You just need to change your SPARK_HOME environment variable, since it currently points to C:\spark140. You have to make it point to a newer Spark version appropriate for the Sparkling Water version you're using.

What do you see when you run echo %SPARK_HOME% in the Windows command prompt?

You can check and change your environment variable as follows:

1) System (Control Panel)
2) Advanced system settings
3) Environment Variables.
4) Find and change SPARK_HOME to point to Spark 1.6
5) Restart your prompt
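You could also verify and set it from the command prompt itself, for example like this (the Spark path below is just an example; use your actual Spark 1.6 install directory):

```shell
:: Show the current value in this session
echo %SPARK_HOME%
:: Persist a new value for future command prompts (example path, adjust to your install)
setx SPARK_HOME "C:\spark-1.6.1-bin-hadoop2.6"
:: Note: setx affects new prompts only, so restart the prompt afterwards
```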

Mateusz
