scala spark-shell not opening in current directory

taylor.sansom

Jan 27, 2017, 4:02:14 PM
to scala...@googlegroups.com
Howdy all, I'm new to Scala and Spark but not new to programming, and I've
been having a problem that I have never encountered before and don't
understand. I want to start my spark-shell in a specific directory, but when
I navigate to the folder, run spark-shell, and check user.dir, it always
says C:\... I'm taking some online courses to learn Scala and Spark, but
nobody else seems to have this problem. Summary below:

PS C:\Spark\My_Programs> cd .\Spark_DataFrames
PS C:\Spark\My_Programs\Spark_DataFrames> spark-shell
...
loading text removed
...
scala> System.getProperty("user.dir")
res0: String = C:\

If I try to load the file df.scala (which lives in
C:\Spark\My_Programs\Spark_DataFrames) it says the file does not exist. I
have to specify the full path of the file to load it:

scala> :load df.scala
That file does not exist

scala> :load \Spark\My_Programs\Spark_DataFrames\df.scala
Loading \Spark\My_Programs\Spark_DataFrames\df.scala...
I'm working!!!

I assumed that I could just change the user.dir property to the correct path
and then load the file, but I get the following:

scala> System.setProperty("user.dir", "C:\\Spark\\My_Programs\\Spark_DataFrames\\")
res1: String = C:\

scala> System.getProperty("user.dir")
res2: String = C:\Spark\My_Programs\Spark_DataFrames\

scala> :load df.scala
That file does not exist

But I can still give the full path to the file and it will run fine (which
tells me I'm still in C:\):

scala> :load \Spark\My_Programs\Spark_DataFrames\df.scala
Loading \Spark\My_Programs\Spark_DataFrames\df.scala...
I'm working!!!
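
The same mismatch shows up without :load, by the way. On Java 8 (current as
of this thread), getAbsolutePath re-reads the user.dir property on each call,
while exists hands the relative name to the OS, which resolves it against
the real working directory. A sketch (not from my actual session; newer JDKs
reportedly cache user.dir at startup, which would make the setProperty
invisible even here):

// After the setProperty above, the two java.io calls disagree:
new java.io.File("df.scala").getAbsolutePath // resolves against the user.dir property just set
new java.io.File("df.scala").exists          // false: the OS still looks in C:\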

I would really love to be able to run the Spark shell in my current
directory (like I can on my Mac and Linux systems) so I don't have to
specify the full path to the file every time. Does anyone have suggestions
on why this is happening and how I can remedy it? Many thanks - Taylor




Oliver Ruebenacker

Jan 27, 2017, 4:26:41 PM
to taylor.sansom, scala-user

     Hello,

  The current working directory (cwd) at the time the JVM starts will be the cwd for that JVM for the rest of its life. It cannot be changed by changing System properties.
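
  In other words, user.dir is an ordinary system property. A minimal sketch of the distinction (the path is only an example):

// Overwriting user.dir rewrites the property, nothing more: the JVM
// has no supported API for changing the working directory the OS
// assigned it at launch.
object UserDirDemo extends App {
  println(System.getProperty("user.dir"))           // directory the JVM started in
  System.setProperty("user.dir", "/somewhere/else") // changes only the property
  println(System.getProperty("user.dir"))           // now reports /somewhere/else
  // Relative paths handed to the OS are still resolved against the
  // original start directory.
}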

  spark-shell is a script that sets the cwd before it calls the JVM. Configure or edit the spark-shell script to run with a different cwd.
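
  In JVM terms, the launcher's choice is what ProcessBuilder#directory expresses, as in the sketch below (the command and paths are illustrative, and a .cmd launcher on Windows has to go through cmd /c). Note that if the script itself changes directory before starting java, it overrides whatever the launcher chose, which is why the script is the place to fix this.

import java.io.File

// Sketch: whoever starts a process picks its working directory;
// the child cannot change its own cwd afterwards.
object LaunchInDir extends App {
  val pb = new ProcessBuilder("cmd", "/c", "spark-shell")
  pb.directory(new File("C:\\Spark\\My_Programs\\Spark_DataFrames")) // child's cwd
  pb.inheritIO() // let the child share this console
  sys.exit(pb.start().waitFor())
}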

     Best, Oliver


--
Oliver Ruebenacker
Senior Software Engineer, Diabetes Portal, Broad Institute
