Null pointer exception when running ./run spark.examples.SparkPi local[1]


Johnpaul Ci

unread,
Mar 8, 2013, 1:44:36 AM3/8/13
to spark...@googlegroups.com
I am encountering an error while running the SparkPi example program.
I issued the command as follows: ./run spark.examples.SparkPi local[1]
Do I need to specify local or localhost?
 
Exception in thread "main" java.lang.NullPointerException
      at java.net.URI$Parser.parse(URI.java:3023)
      at java.net.URI.<init>(URI.java:595)
      at spark.SparkContext.addJar(SparkContext.scala:511)
      at spark.SparkContext$$anonfun$2.apply(SparkContext.scala:102)
      at spark.SparkContext$$anonfun$2.apply(SparkContext.scala:102)
      at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
      at scala.collection.immutable.List.foreach(List.scala:45)
      at spark.SparkContext.<init>(SparkContext.scala:102)
      at spark.examples.SparkPi$.main(SparkPi.scala:13)
      at spark.examples.SparkPi.main(SparkPi.scala)
 
I would appreciate any help regarding this issue.

Benoît Denis

unread,
Mar 8, 2013, 3:26:32 AM3/8/13
to spark...@googlegroups.com
Same issue here, but only with the latest release of Spark...


2013/3/8 Johnpaul Ci <johnp...@gmail.com>


Josh Rosen

unread,
Mar 8, 2013, 3:35:29 AM3/8/13
to spark...@googlegroups.com
If you look at $SPARK_HOME/examples/target/scala-2.9.2/, does that directory include multiple spark-examples-*.jar files?

If it contains JARs from different Spark versions, then the `run` script sets SPARK_EXAMPLES_JAR to an incorrect value, leading to the URI$Parser NullPointerException.

Try manually removing the old spark-examples JARs, or run `sbt/sbt clean package` to perform a clean build.
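For example, a quick way to check (the path below assumes the default SBT build layout and Scala 2.9.2):

      ls $SPARK_HOME/examples/target/scala-2.9.2/spark-examples*.jar
      # if more than one spark-examples JAR shows up, delete the stale ones,
      # or do a clean rebuild from the Spark top-level directory:
      cd $SPARK_HOME && sbt/sbt clean package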

johnpaul ci

unread,
Mar 8, 2013, 3:53:19 AM3/8/13
to spark...@googlegroups.com
I couldn't see any jar files in the respective directory (/spark/examples/target/scala-2.9.2).
 
The scenario is that I downloaded Spark from the git repository and am trying to configure it for standalone mode.
 
Can you suggest which is the stable version and the exact steps for a Spark installation? The Scala version mentioned in the documentation is not available on the scala-lang site.
 
In my conf file (spark-env.sh) I have only included SCALA_HOME and JAVA_HOME.
For running it in standalone mode, do I need to set any other variables?
 
I am performing this installation on a VMware Ubuntu 12.04.2 VM with 512 MB of RAM.
 
 


 


--
with regards

johnpaul c i

Josh Rosen

unread,
Mar 8, 2013, 4:09:06 AM3/8/13
to spark...@googlegroups.com
I can reproduce the NullPointerException by modifying the `run` script so that SPARK_EXAMPLES_JAR is never set.  I can cause SPARK_EXAMPLES_JAR to never be set by modifying my target directory to contain multiple jars that match the regular expression that's used to find the example jar.  Hence, my suspicion that your /spark/examples/target/ directory contains multiple jars.  To see if this is happening, look in the appropriate target directory for whichever version of Scala you're using; for me, this was /spark/examples/target/scala-2.9.2.

Spark currently supports Scala 2.9.2, although users have reported that it runs fine on 2.9.3 (there are some other mailing list threads discussing efforts to support Scala 2.10).

Did you try recompiling with `sbt/sbt clean package`?

johnpaul ci

unread,
Mar 8, 2013, 4:16:14 AM3/8/13
to spark...@googlegroups.com
I tried the command 'sbt/sbt clean package' and then issued 'sbt/sbt package'. It showed the following in the terminal:
[info] Loading project definition from /usr/local/spark/ .sbt
[info] Set current project to default-ab29b9 (in build file:/usr/local/spark)
[success] Total time: 0 s, completed Mar ..........

Josh Rosen

unread,
Mar 8, 2013, 4:28:55 AM3/8/13
to spark...@googlegroups.com
It looks like sbt isn't picking up the right project file, so it's falling back on some sort of default project.

I think the cause of your original problem is duplicate JARs in the examples target directory.  I suggested `sbt/sbt clean package` because that should be a foolproof fix, provided that you can build Spark from source.  The other option is to manually delete the extra JAR files.

You might have wound up with multiple JARs if you used `rsync` to deploy a newer version of Spark on your cluster, but did not use the --delete option.
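For example (the hostname and paths here are placeholders, not taken from your setup):

      # push a freshly built Spark tree to a worker, deleting files
      # that no longer exist on the source side
      rsync -av --delete /usr/local/spark/ worker1:/usr/local/spark/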

You could always back up your /usr/local/spark directory, delete it, and perform a fresh install from either the Spark source / GitHub or one of the binary distributions.

Do you see multiple example jars in the examples/target directory?

johnpaul ci

unread,
Mar 8, 2013, 5:43:25 AM3/8/13
to spark...@googlegroups.com
In the folder target/scala-2.9.1/
I could find a jar named Defaultscala.-2.9.1****.jar.
 
Meanwhile, I tried a fresh setup with spark-0.6.2 and issued the command sbt/sbt package.
 
Everything went fine, but when I invoked the spark-shell I encountered this error:
 
Exception in thread main (RuntimeException ...): Cannot figure out how to run the target: spark.repl.Main ...
 
I will explain the steps I have followed; perhaps you can figure out where I went wrong.
 
I downloaded the spark-0.6.2 package and placed it in /usr/local,
and did the same with scala-2.9.2.
I have installed Java (OpenJDK 7).
 
My Spark conf file, spark-env.sh, contains two variables:
SCALA_HOME and JAVA_HOME.
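Roughly like this (the paths below are only placeholders, not my exact ones):
 
      export SCALA_HOME=/usr/local/scala-2.9.2
      export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64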
 
After all this I executed sbt/sbt package.
 
Usually I encounter an error message saying that GitHub cannot be reached and the cloning cannot be done, so I changed the git:// URL to https and then the download succeeded.
 
When I run sbt/sbt package again, it shows a success message after 0 s.
 
But after all that, when I start the shell, the errors appear.
 
I am using a VMware Ubuntu 12.04 VM with 512 MB of RAM.

Matei Zaharia

unread,
Mar 8, 2013, 10:44:45 AM3/8/13
to spark...@googlegroups.com
You should be using Scala 2.9.2 instead of 2.9.1, and you should run SBT out of the top-level directory (/usr/local/spark I guess), not out of any subdirectory. Then it will build the correct file. Try it with a new download.
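In other words, something like (assuming /usr/local/spark really is your top-level Spark directory):

      cd /usr/local/spark
      sbt/sbt clean package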

Matei

johnpaul ci

unread,
Mar 10, 2013, 2:59:49 PM3/10/13
to spark...@googlegroups.com
Hi Matei


I could successfully execute the Pi program on my laptop.

I would like to know how a standalone job in Scala is done. I made the demo file SimpleJob.scala in the Spark home directory, along with simple.sbt, and executed the sbt package and sbt run commands, but it shows an error. Could you please describe how a standalone job is made, packaged, and executed?

John

johnpaul ci

unread,
Mar 11, 2013, 12:42:53 AM3/11/13
to spark...@googlegroups.com
Hi
 
For making the standalone job I followed the instructions in the documentation at the terminal prompt.
 
I issued the find command: find . ./src ./src/main ................
 
but it encountered the following error:
 
find ./ ... no such file or directory
find ./ .. no such file or directory ..

Matei Zaharia

unread,
Mar 11, 2013, 2:21:31 AM3/11/13
to spark...@googlegroups.com
Don't put the files in Spark's directory; it will be confused by the existing SBT build in there. Just make a new directory for your project and follow the instructions at http://spark-project.org/docs/latest/quick-start.html
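Roughly, the layout from the quick-start guide looks like this (the dependency line below is only a sketch; check the guide for the exact group and version string):

      ./simple.sbt
      ./src/main/scala/SimpleJob.scala

with simple.sbt along the lines of:

      name := "Simple Project"

      version := "1.0"

      scalaVersion := "2.9.2"

      libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.0"

Then run sbt package and sbt run from that new project directory, not from the Spark source tree.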

Matei

johnpaul ci

unread,
Mar 11, 2013, 3:35:50 AM3/11/13
to spark...@googlegroups.com
Hi Matei 

Thanks a lot. 

I can do all this with the spark-0.7.0 prebuilt package, but when I try to build spark-0.7.0 myself using the sbt/sbt package command, it fails showing an error
and also some warnings.
I couldn't compile it.
 
Do I need to change any proxy settings for my network access? I tried the no-proxy option, but the error still persists.

Regards
john

johnpaul ci

unread,
Mar 11, 2013, 4:38:03 AM3/11/13
to spark...@googlegroups.com
Hi Rosen

I could run the precompiled version

But when compiling spark-0.6.2 from source with scala-2.9.2, I am encountering an error:
 
root@ubuntu:/usr/local/spark-0.6.2# sbt/sbt clean package
[info] Loading project definition from /usr/local/spark-0.6.2/project/project
[info] Loading project definition from /usr/local/spark-0.6.2/project
[info] Updating {file:/usr/local/spark-0.6.2/project/}plugins...
[info] Resolving eu.henkelmann#junit_xml_listener;0.3 ...
[info] Resolving com.eed3si9n#sbt-assembly;0.8.3 ...
[error] SERVER ERROR: Parent proxy unreacheable url=http://repo.typesafe.com/typesafe/ivy-releases/com.eed3si9n/sbt-assembly/scala_2.9.1/sbt_0.11.3/0.8.3/ivys/ivy.xml
[error] SERVER ERROR: Parent proxy unreacheable url=http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases/com.eed3si9n/sbt-assembly/scala_2.9.1/sbt_0.11.3/0.8.3/ivys/ivy.xml
[error] SERVER ERROR: Parent proxy unreacheable url=http://repo.typesafe.com/typesafe/releases/com/eed3si9n/sbt-assembly_2.9.1_0.11.3/0.8.3/sbt-assembly-0.8.3.pom


Something went wrong in the compilation itself.
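Could the fix be to pass the standard JVM proxy properties to sbt (just a guess on my part), for example by adding something like this to the java invocation inside the sbt/sbt script, with my real proxy host and port instead of the placeholders?
 
      -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=3128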

Regards
john


Arapat Alafate

unread,
Jun 7, 2013, 4:09:25 AM6/7/13
to spark...@googlegroups.com
Hi,

I encountered this error too, and I fixed it with a small change to the ./run script.

I am working with spark-0.7.2 and scala-2.9.3.

Basically, replace "spark-examples-" with "spark-examples_" in lines 144-147:

144 if [ -e "$EXAMPLES_DIR/target/scala-$SCALA_VERSION/spark-examples_"!(*sources|*javadoc) ]; then
145   # Use the JAR from the SBT build
146   export SPARK_EXAMPLES_JAR=`ls "$EXAMPLES_DIR/target/scala-$SCALA_VERSION/spark-examples_"!(*sources|*javadoc).jar`
147 fi

I am not very familiar with sbt, so I can't really explain why the jar ended up named that way.

Hope this helps.

baontq

unread,
Jun 11, 2013, 10:48:51 PM6/11/13
to spark...@googlegroups.com
Thanks! It works.

On Friday, June 7, 2013 at 15:09:25 UTC+7, Arapat Alafate wrote:

Josh Rosen

unread,
Jun 11, 2013, 10:55:05 PM6/11/13
to spark...@googlegroups.com
I opened an issue to track this bug:

Vadim Semenov

unread,
Jun 12, 2013, 9:36:31 PM6/12/13
to spark...@googlegroups.com
Thanks!

johnpaul ci

unread,
Jun 12, 2013, 10:38:41 PM6/12/13
to spark...@googlegroups.com
Thanks for the solution.

Thanks!
--
kind regards

Johnpaul C I

Ambarish Hazarnis

unread,
Jun 25, 2013, 9:13:05 PM6/25/13
to spark...@googlegroups.com
Thanks Arapat :)