createSparkContext on edu.berkeley.cs.amplab.sparkr.RRDD failed --> SparkR(SparkR-pkg)


Shiva Ram

Sep 9, 2015, 6:01:12 AM
to SparkR Developers
Hi All,

I am using Cloudera CDH (Hadoop 2.6.0-cdh5.4.5) with Spark 1.3.0.
I want to use SparkR, so I downloaded SparkR-pkg from https://github.com/amplab-extras/SparkR-pkg/tree/master

I then built it with the following steps:

./SparkR_prep-0.1.sh

sudo USE_YARN=1 SPARK_VERSION=1.3.0 SPARK_HADOOP_VERSION=2.6.0-cdh5.4.5 SPARK_YARN_VERSION=2.6.0 ./install-dev.sh


The build succeeded, as shown below:

////////////////////////////////////////////////

installing to /home/virtusa/SparkR_WorkArea/SparkR-pkg/lib/SparkR/libs
** R
** inst
** preparing package for lazy loading
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
** help
*** installing help indices
** building package indices
** testing if installed package can be loaded
* DONE (SparkR)

///////////////////////////////////////////////

When I try to launch SparkR as shown below,

./sparkR

I get the following error:

[SparkR] Initializing with classpath /home/virtusa/SparkR_WorkArea/SparkR-pkg/lib/SparkR/sparkr-assembly-0.1.jar

Launching java with command  java   -Xmx512m -cp '/home/virtusa/SparkR_WorkArea/SparkR-pkg/lib/SparkR/sparkr-assembly-0.1.jar:' edu.berkeley.cs.amplab.sparkr.SparkRBackend /tmp/Rtmpq44nHN/backend_port3b7e2ee7f5ab
createSparkContext on edu.berkeley.cs.amplab.sparkr.RRDD failed with java.lang.reflect.InvocationTargetException
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:622)
    at edu.berkeley.cs.amplab.sparkr.SparkRBackendHandler.handleMethodCall(SparkRBackendHandler.scala:111)
    at edu.berkeley.cs.amplab.sparkr.SparkRBackendHandler.channelRead0(SparkRBackendHandler.scala:58)
    at edu.berkeley.cs.amplab.sparkr.SparkRBackendHandler.channelRead0(SparkRBackendHandler.scala:19)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:163)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:701)
Caused by: java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FSDataInputStream : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:643)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:52)
    at edu.berkeley.cs.amplab.sparkr.RRDD$.createSparkContext(RRDD.scala:340)
    at edu.berkeley.cs.amplab.sparkr.RRDD.createSparkContext(RRDD.scala)
    ... 25 more
Error: returnStatus == 0 is not TRUE

How can I resolve this issue? Thanks.

My Java version is:

java version "1.6.0_36"
OpenJDK Runtime Environment (IcedTea6 1.13.8) (6b36-1.13.8-0ubuntu1~12.04)
OpenJDK 64-Bit Server VM (build 23.25-b01, mixed mode)
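A note on the root cause: "Unsupported major.minor version 51.0" in the stack trace means the Hadoop class was compiled for Java 7, while the JRE shown above is Java 6. Class-file major versions map to JDK releases as major minus 44, which can be checked with a quick sketch:

```shell
# Class-file major version -> JDK release (major - 44):
#   50 -> Java 6, 51 -> Java 7, 52 -> Java 8
major=51
echo "class version $major needs Java $((major - 44))"
```

Upgrading the JRE to Java 7 or later (and pointing JAVA_HOME at it) should clear the UnsupportedClassVersionError.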

Shivaram Venkataraman

Sep 9, 2015, 3:49:06 PM
to Shiva Ram, SparkR Developers
The SparkR project is now part of mainline Apache Spark. Please use Spark 1.4.0 or later (1.5.0 was released today: http://spark.apache.org/downloads.html) to get the latest SparkR version.

Thanks
Shivaram


Shiva Ram

Sep 10, 2015, 12:21:12 AM
to SparkR Developers, shivaram....@gmail.com
Hi,

Thanks for your reply Shivaram.

If I download the latest mainline Apache Spark (1.4.1 or 1.5.0), how do I use or build it with Hadoop 2.6.0-cdh5.4.5, since that is my cluster's version? Basically, I want to use SparkR with Cloudera CDH YARN. Thanks.

Shivaram Venkataraman

Sep 10, 2015, 2:20:07 AM
to Shiva Ram, SparkR Developers
Unfortunately I don't know much about CDH version mapping. Please email the Spark user mailing list (http://spark.apache.org/community.html), or post this on the CDH forums.

Thanks
Shivaram
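For readers with the same question: Spark's own build supports overriding the Hadoop version, so a custom distribution against CDH can be sketched roughly as below. This is a sketch under assumptions, not a verified recipe: the exact profile names (-Phadoop-2.6, -Psparkr) and whether the CDH artifact resolves from your Maven repositories should be checked against the "Building Spark" guide for your release. Note the build also requires JDK 7 or later.

```shell
# Hypothetical sketch: build a Spark 1.5.0 distribution with SparkR
# against a CDH Hadoop version (verify flags against the Building
# Spark guide; the Cloudera Maven repository may need to be added).
./make-distribution.sh --tgz \
  -Pyarn -Phadoop-2.6 \
  -Dhadoop.version=2.6.0-cdh5.4.5 \
  -Psparkr -DskipTests
```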
