Error building Spark


Sudeep Bhattarai

Oct 24, 2013, 12:33:16 AM
to spark...@googlegroups.com
I am trying to install Spark on CentOS. While building Spark with the `sbt/sbt assembly` command, I get the following error:

    [warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:129: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information.
    [warn]     getOutputCommitter().cleanupJob(getJobContext())
    [warn]                          ^
    [warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:592: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information.
    [warn]     jobCommitter.cleanupJob(jobTaskContext)
    [warn]                  ^
    [warn] two warnings found
    [error] ----------
    [error] 1. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 22)
    [error]         import io.netty.channel.ChannelFuture;
    [error]                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    [error] The import io.netty.channel.ChannelFuture is never used
    [error] ----------
    [error] 2. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 23)
    [error]         import io.netty.channel.ChannelFutureListener;
    [error]                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    [error] The import io.netty.channel.ChannelFutureListener is never used
    [error] ----------
    [error] ----------
    [error] 3. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileServer.java (at line 23)
    [error]         import io.netty.channel.Channel;
    [error]                ^^^^^^^^^^^^^^^^^^^^^^^^
    [error] The import io.netty.channel.Channel is never used
    [error] ----------
    [error] ----------
    [error] 4. WARNING in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/JavaSparkContextVarargsWorkaround.java (at line 20)
    [error]         import java.util.Arrays;
    [error]                ^^^^^^^^^^^^^^^^
    [error] The import java.util.Arrays is never used
    [error] ----------
    [error] ----------
    [error] 5. ERROR in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/function/DoubleFlatMapFunction.java (at line 36)
    [error]         public final Iterable<Double> apply(T t) { return call(t); }
    [error]                                       ^^^^^^^^^^
    [error] The method apply(T) of type DoubleFlatMapFunction<T> must override a superclass method
    [error] ----------
    [error] 5 problems (1 error, 4 warnings)
    [error] (core/compile:compile) javac returned nonzero exit code
    [error] Total time: 431 s, completed Oct 24, 2013 7:42:21 AM


The version of Java installed on my machine is 1.6.0_35. Earlier I used JDK 1.4. Which version of Java should I use? Or is it some other issue?
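For context on the one real error above (problem 5): `@Override` on a method that implements an *interface* method is only legal from Java 6 onward, so a compiler running at source level 1.5 rejects it with exactly this "must override a superclass method" message — consistent with an old JDK (or old `-source` setting) sneaking into the build. A minimal sketch, using a hypothetical `FlatMapper` interface standing in for Spark's `DoubleFlatMapFunction`:

```java
import java.util.Collections;

// Under -source 1.6 (or later) this compiles; under -source 1.5 javac
// fails on the @Override with "must override a superclass method".
interface FlatMapper<T> {
    Iterable<Double> apply(T t);
}

public class OverrideDemo implements FlatMapper<String> {
    @Override  // annotating an interface implementation: Java 6+ only
    public Iterable<Double> apply(String s) {
        return Collections.singletonList((double) s.length());
    }

    public static void main(String[] args) {
        // prints 3.0
        System.out.println(new OverrideDemo().apply("abc").iterator().next());
    }
}
```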

Sudeep Bhattarai

Oct 24, 2013, 12:45:13 AM
to spark...@googlegroups.com
I also tried with JDK 1.7.0_45, which gave the same set of errors.

Patrick Wendell

Oct 24, 2013, 1:31:04 AM
to spark...@googlegroups.com
How are you setting the JDK when you build? This looks a lot like an older version of Java is being pulled in by sbt. We need at least 1.6, but perhaps somehow sbt is pulling in that older version.
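One quick way to check Patrick's suggestion is to print the version and install path of the JVM that is actually running — these properties are standard, though the values shown in the comments are only examples:

```java
// Minimal sketch: confirm which JVM is really in use, since the JDK on
// your PATH and the one sbt launches may differ.
public class WhichJvm {
    public static void main(String[] args) {
        System.out.println(System.getProperty("java.version")); // e.g. "1.6.0_35"
        System.out.println(System.getProperty("java.home"));    // install directory
    }
}
```

Running the same two `System.getProperty` calls from `sbt console` would show which JVM sbt itself launched, which is the one that matters for the build.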


--
You received this message because you are subscribed to the Google Groups "Spark Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to spark-users...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.

Sudeep Bhattarai

Oct 25, 2013, 3:14:11 AM
to spark...@googlegroups.com
Thanks Patrick, that issue is solved now.
The build is successful, but I've hit the next problem: spark-shell is not working, throwing an UnknownHostException.
=====================================================================

[root@jbosstest1 spark-0.8.0-incubating]# ./spark-shell
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 0.8.0
      /_/

Using Scala version 2.9.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_35)
Initializing interpreter...
13/10/25 12:52:51 INFO server.Server: jetty-7.x.y-SNAPSHOT
13/10/25 12:52:51 INFO server.AbstractConnector: Started SocketC...@0.0.0.0:41915
Exception in thread "main" java.net.UnknownHostException: jbosstest1: jbosstest1
        at java.net.InetAddress.getLocalHost(InetAddress.java:1360)
        at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:341)
        at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:333)
        at org.apache.spark.HttpServer.uri(HttpServer.scala:86)
        at org.apache.spark.repl.SparkIMain.<init>(SparkIMain.scala:107)
        at org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.<init>(SparkILoop.scala:128)
        at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:155)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:865)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:905)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)

======================================================================
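The trace shows the failure originates in `InetAddress.getLocalHost()`, which throws `UnknownHostException` when the machine's hostname (here `jbosstest1`) does not resolve to any address. The usual fix on the host is an `/etc/hosts` line mapping the name to an address, e.g. `127.0.0.1   jbosstest1`. A minimal sketch of the failing call, with a loopback fallback added purely for demonstration:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class LocalHostDemo {
    public static void main(String[] args) {
        InetAddress addr;
        try {
            // The call that throws in the stack trace above when the
            // local hostname has no address mapping.
            addr = InetAddress.getLocalHost();
        } catch (UnknownHostException e) {
            // Fallback to 127.0.0.1 so the demo still prints something.
            addr = InetAddress.getLoopbackAddress();
        }
        System.out.println(addr.getHostAddress());
    }
}
```

On the reporter's machine, `ping jbosstest1` failing would confirm the resolution problem; Spark's `spark-env.sh` also accepts a `SPARK_LOCAL_IP` variable as a way to pin the address explicitly.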