error in rhinit()


nikhil yadav

Jan 15, 2014, 7:33:00 AM
to rh...@googlegroups.com
Greetings
I am using openSUSE 13.1 64-bit. I successfully installed RHIPE, but when I start using it at the R prompt and type rhinit(), it gives me this error:

Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,  :
  java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.


I have already googled a lot but didn't find any solution.

Please help.

Ryan Ward Kelley

Feb 4, 2014, 4:36:58 PM
to rh...@googlegroups.com
Hi All,

I also just installed Hadoop/RHIPE on my Mac OS X 10.7.5 machine and I'm getting exactly the same error when I run rhinit().

Using:
  • hadoop 2.2.0
  • java version "1.6.0_65"
  • R version 3.0.2
  • protobuf version 2.4.1
  • RHIPE version 73.1.5
Exact error message:
> library(Rhipe)
------------------------------------------------
| Please call rhinit() else RHIPE will not run |
------------------------------------------------
> rhinit()
Rhipe: Using Rhipe.jar file
Initializing Rhipe v0.73
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hadoop/2.2.0/libexec/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hadoop/2.2.0/libexec/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2014-02-04 13:28:45.237 R[1519:b07] Unable to load realm info from SCDynamicStore
14/02/04 13:28:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/02/04 13:28:45 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
14/02/04 13:28:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,  : 
  java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.

I'm new to Java and RHIPE. Can anyone parse this?

Thanks,
Ryan

Saptarshi Guha

Feb 4, 2014, 10:19:54 PM
to rh...@googlegroups.com

I've moved to OS X recently, so I can try to confirm whether this is working. I'll get back to you tomorrow.

You received this message because you are subscribed to the Google Groups "rhipe" group.
To unsubscribe from this group and stop receiving emails from it, send an email to rhipe+un...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.

Ryan Ward Kelley

Feb 6, 2014, 12:50:05 PM
to rh...@googlegroups.com
Hi Saptarshi,

Any luck?

-Ryan



Saptarshi Guha

Feb 6, 2014, 1:17:13 PM
to rh...@googlegroups.com
Ah, sorry, will try this today.
Been busy at work.
Cheers
Saptarshi

Saptarshi Guha

Feb 6, 2014, 7:25:54 PM
to rh...@googlegroups.com
My bash profile looks like:

HADOOP=/usr/local/hadoop/
export HADOOP_LIBS=${HADOOP}/share/hadoop/mapreduce1/:${HADOOP}/share/hadoop/mapreduce1/lib
export HADOOP_LIBS=$HADOOP_LIBS:${HADOOP}/share/hadoop/hdfs/:${HADOOP}/share/hadoop/common/
export HADOOP_LIBS=$HADOOP_LIBS:${HADOOP}/share/hadoop/httpfs/:${HADOOP}/share/hadoop/tools/
export HADOOP_HOME=/usr/local/hadoop/
export HADOOP_CONF_DIR=/usr/local/conf.mango/hadoop
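As a quick sanity check (my sketch, not from the thread; the HADOOP path and sub-paths are copied from the profile above and may differ on your install), you can confirm each directory on HADOOP_LIBS actually exists before starting R:

```shell
# Hypothetical check: verify every directory in HADOOP_LIBS exists.
# HADOOP and the sub-paths are copied from the profile above; adjust as needed.
HADOOP=/usr/local/hadoop
HADOOP_LIBS=${HADOOP}/share/hadoop/mapreduce1:${HADOOP}/share/hadoop/mapreduce1/lib
HADOOP_LIBS=$HADOOP_LIBS:${HADOOP}/share/hadoop/hdfs:${HADOOP}/share/hadoop/common
IFS=':' read -r -a dirs <<< "$HADOOP_LIBS"
for d in "${dirs[@]}"; do
  if [ -d "$d" ]; then echo "ok: $d"; else echo "missing: $d"; fi
done
```

If anything prints "missing", RHIPE's classpath will be incomplete and the jar lookups can fail in odd ways.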


I used 0.73-3 (though -5 should work too) from http://ml.stat.purdue.edu/rhipebin/Rhipe_0.73.1-3.tar.gz. I have not tested this with Apache Hadoop (the rearch branch is being made to work with Apache Hadoop).



export RHIPE_USE_CDH4=yes
R

> library(Rhipe)
Rhipe: HADOOP_BIN is missing, using $HADOOP/bin

------------------------------------------------
| Please call rhinit() else RHIPE will not run |
------------------------------------------------
> rhinit()
Rhipe: Using RhipeCDH4.jar

Initializing Rhipe v0.73
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/mapreduce1/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2014-02-06 16:24:02.676 R[27595:d07] Unable to load realm info from SCDynamicStore
14/02/06 16:24:02 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-02-06 16:24:02.774 R[27595:d07] Unable to load realm info from SCDynamicStore
Initializing mapfile caches
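If rhinit() fails instead of printing the above, one quick thing to rule out (my suggestion, not from the thread) is that the profile exports never reached the shell that launched R:

```shell
# Print the variables RHIPE reads; empty values mean the profile was not sourced.
# Variable names are the ones used in the bash profile earlier in this thread.
for v in HADOOP_HOME HADOOP_CONF_DIR HADOOP_LIBS RHIPE_USE_CDH4; do
  printf '%s=%s\n' "$v" "$(printenv "$v")"
done
```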




Archana

Feb 7, 2014, 2:00:08 AM
to rh...@googlegroups.com


I was also getting similar issues when I was trying to use the Hadoop 2.2 version, but when I tried an earlier version it worked perfectly.

Ryan Ward Kelley

Feb 7, 2014, 11:52:04 AM
to rh...@googlegroups.com
Ok, thank you very much -- I will give this a try.

Just to be clear: you used a different version of Hadoop from the link you provided and that worked, and Apache Hadoop is not supported by RHIPE at this time. Is this correct?

Thanks,
Ryan

Saptarshi Guha

Feb 7, 2014, 12:06:26 PM
to rh...@googlegroups.com
Yes, we initially tested on Apache, but to make it work with CDH4 I introduced a change which made it incompatible with Apache!

We have fixed this glitch but it is not released yet.

Does the website say to use Apache? That needs to be changed ASAP.

Yes, please do test and reply here. I can help if you have issues.
Cheers
Saptarshi

Ryan Ward Kelley

Feb 7, 2014, 12:09:56 PM
to rh...@googlegroups.com
Oh, sorry. I was using a book I downloaded: http://www.packtpub.com/big-data-analytics-with-r-and-hadoop/book

Its instructions say to use Hadoop, but it is a bit older. I will switch to CDH4 and give it a try.

Thank you again,
Ryan

Saptarshi Guha

Feb 7, 2014, 12:29:56 PM
to rh...@googlegroups.com
Hmm, some tech books get outdated very quickly. The mailing list is fairly helpful.

salah bensator

Mar 27, 2015, 3:14:46 PM
to rh...@googlegroups.com
I have exactly the same problem.
I didn't quite follow: were you able to fix it?
Is there a way to fix the problem without using CDH?