Loading Failed Permission Denied Rhipe Installation


engg.sy...@gmail.com

Feb 22, 2013, 10:19:53 AM
to rh...@googlegroups.com
I am getting the following error when I run the command "R CMD INSTALL Rhipe_0.72.1.tar.gz":

[root@localhost Rhipe]# R CMD INSTALL Rhipe
* installing to library ‘/usr/local/lib/R/library’
* installing *source* package ‘Rhipe’ ...
** libs
g++ -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -fpic `/usr/local/lib/R/bin/R CMD config --ldflags` `pkg-config --libs protobuf`  `/usr/local/lib/R/bin/R CMD config --cppflags` `/usr/local/lib/R/bin/R CMD config --ldflags` rexp.pb.o message.o fileio.o signal.o display.o reducer.o mapper.o mapreduce.o main.o -o ../inst/bin/RhipeMapReduce
chmod 755 ../inst/bin/RhipeMapReduce
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c serverbridge.cc -o serverbridge.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c rhooks.cc -o rhooks.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c md5.c -o md5.o
g++ -shared -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -fpic `/usr/local/lib/R/bin/R CMD config --ldflags` `pkg-config --libs protobuf`  `/usr/local/lib/R/bin/R CMD config --cppflags` `/usr/local/lib/R/bin/R CMD config --ldflags` rexp.pb.o message.o serverbridge.o fileio.o rhooks.o md5.o -o Rhipe.so
installing to /usr/local/lib/R/library/Rhipe/libs
** R
** inst
** preparing package for lazy loading
** help
*** installing help indices
** building package indices
** testing if installed package can be loaded
Error : .onLoad failed in loadNamespace() for 'Rhipe', details:
  call: dyn.load(file, DLLpath = DLLpath, ...)
  error: unable to load shared object '/usr/local/lib/R/library/Rhipe/libs/Rhipe.so':
  /usr/local/lib/R/lib/libR.so: cannot restore segment prot after reloc: Permission denied
Error: loading failed
Execution halted
ERROR: loading failed
* removing ‘/usr/local/lib/R/library/Rhipe’


I have installed all prerequisites successfully, including Hadoop, R, and Protobuf.

In the link they create an environment variable named "LD_LIBRARY_PATH", and I have declared it like this: "LD_LIBRARY_PATH='/usr/local/lib/R/lib'".
I have the following files in that location: libR.so, libRblas.so, libRlapack.so. Am I doing something wrong?
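
For reference, setting and checking the variable in a shell might look like the following (a sketch, assuming the R install path quoted above):

```shell
# Point the dynamic linker at R's shared libraries (path from the post above)
export LD_LIBRARY_PATH=/usr/local/lib/R/lib
# Verify the variable is set as expected
echo "$LD_LIBRARY_PATH"
# The directory should contain libR.so, libRblas.so, and libRlapack.so:
# ls "$LD_LIBRARY_PATH"
```

Note that an assignment without `export` is only visible to the current shell, not to child processes such as R.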

Hoping for your suggestions.

Thanks in advance

Saptarshi Guha

Feb 22, 2013, 10:25:43 AM
to rh...@googlegroups.com
A quick Google search tells me that this appears to be caused by SELinux features in modern Linux distributions.
See http://www.quantumwise.com/support/faq/117-cannot-restore-segment-prot-after-reloc-permission-denied?catid=25%3Ainstallation-issues for an explanation,
and you can turn it off; see http://thompsonng.blogspot.com/2012/03/linux-cannot-restore-segment-prot-after.html
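
The checks and fixes described in those links boil down to something like the following (a sketch of common remedies for this SELinux error; the commands need root, and the `chcon` path assumes the libR.so location from the error message above):

```shell
# Check whether SELinux is currently enforcing
getenforce
# Option 1: temporarily disable enforcement (resets on reboot)
setenforce 0
# Option 2: keep SELinux enabled, but mark libR.so as a shared library
# that is allowed to perform text relocations
chcon -t textrel_shlib_t /usr/local/lib/R/lib/libR.so
```

Option 2 is narrower: it relabels only the offending library instead of switching SELinux to permissive mode system-wide.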



You received this message because you are subscribed to the Google Groups "rhipe" group.
To unsubscribe from this group and stop receiving emails from it, send an email to rhipe+un...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
 
 

engg.sy...@gmail.com

Feb 25, 2013, 2:47:37 AM
to rh...@googlegroups.com, saptars...@gmail.com
I tried what you suggested; now I am getting this:

[root@localhost Rhipe]# R CMD INSTALL Rhipe_0.72.1.tar.gz

* installing to library ‘/usr/local/lib/R/library’
* installing *source* package ‘Rhipe’ ...
** libs
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c rexp.pb.cc -o rexp.pb.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c message.cc -o message.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c fileio.cc -o fileio.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c signal.cc -o signal.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c display.cc -o display.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c reducer.cc -o reducer.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c mapper.cc -o mapper.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c mapreduce.cc -o mapreduce.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c main.cc -o main.o

g++ -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -fpic `/usr/local/lib/R/bin/R CMD config --ldflags` `pkg-config --libs protobuf`  `/usr/local/lib/R/bin/R CMD config --cppflags` `/usr/local/lib/R/bin/R CMD config --ldflags` rexp.pb.o message.o fileio.o signal.o display.o reducer.o mapper.o mapreduce.o main.o -o ../inst/bin/RhipeMapReduce
chmod 755 ../inst/bin/RhipeMapReduce
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c serverbridge.cc -o serverbridge.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c rhooks.cc -o rhooks.o
g++ -I/usr/local/lib/R/include -DNDEBUG  -I/usr/local/include    -fpic  -g -O2  -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -c md5.c -o md5.o
g++ -shared -I.  -g -O2  -DUSEAUTOSHORT -DHAVE_UINTPTR_T    `/usr/local/lib/R/bin/R CMD config --cppflags` `pkg-config --cflags protobuf` -fpic `/usr/local/lib/R/bin/R CMD config --ldflags` `pkg-config --libs protobuf`  `/usr/local/lib/R/bin/R CMD config --cppflags` `/usr/local/lib/R/bin/R CMD config --ldflags` rexp.pb.o message.o serverbridge.o fileio.o rhooks.o md5.o -o Rhipe.so
installing to /usr/local/lib/R/library/Rhipe/libs
** R
** inst
** preparing package for lazy loading
** help
*** installing help indices
** building package indices
** testing if installed package can be loaded
Warning in onload.2(libname, pkgname) :
  Rhipe requires the HADOOP or HADOOP_BIN environment variable to be present
 $HADOOP/bin/hadoop or $HADOOP_BIN/hadoop should exists
Warning in onload.2(libname, pkgname) :
  Rhipe: HADOOP_BIN is missing, using $HADOOP/bin
--------------------------------------------------------
| IMPORTANT: Before using Rhipe call rhinit()           |
| Rhipe will not work or most probably crash            |
--------------------------------------------------------

* DONE (Rhipe)
[root@localhost Rhipe]# echo $HADOOP_BIN

[root@localhost Rhipe]# echo $HADOOP

[root@localhost Rhipe]# export HADOOP=/home/hadoop/project/hadoop-1.0.4/bin
[root@localhost Rhipe]# export HADOOP_BIN=/home/hadoop/project/hadoop-1.0.4/bin
[root@localhost Rhipe]# R

R version 2.15.1 (2012-06-22) -- "Roasted Marshmallows"
Copyright (C) 2012 The R Foundation for Statistical Computing
ISBN 3-900051-07-0
Platform: i686-pc-linux-gnu (32-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> library(Rhipe)
--------------------------------------------------------
| IMPORTANT: Before using Rhipe call rhinit()           |
| Rhipe will not work or most probably crash            |
--------------------------------------------------------
> rhinit()
13/02/26 12:39:40 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
13/02/26 12:39:41 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 1 time(s).
13/02/26 12:39:42 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 2 time(s).
13/02/26 12:39:43 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 3 time(s).
13/02/26 12:39:44 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 4 time(s).
13/02/26 12:39:45 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 5 time(s).
13/02/26 12:39:46 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 6 time(s).
13/02/26 12:39:47 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 7 time(s).
13/02/26 12:39:48 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 8 time(s).
13/02/26 12:39:49 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 9 time(s).
Exception in thread "main" java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:1099)
        at org.apache.hadoop.ipc.Client.call(Client.java:1075)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
        at org.godhuli.rhipe.PersonalServer.run(PersonalServer.java:755)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.godhuli.rhipe.PersonalServer.main(PersonalServer.java:772)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:599)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:489)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
        at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1206)
        at org.apache.hadoop.ipc.Client.call(Client.java:1050)
        ... 21 more

Two days ago I did the whole installation successfully, and both ports came up, but now, after following your suggestion, I am getting this error again. :(

When I got this error before, I redid the whole installation from the beginning, including the RHEL 5 installation.
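
One aside on the transcript above: the warning during installation says `$HADOOP/bin/hadoop` or `$HADOOP_BIN/hadoop` should exist, which suggests `HADOOP` should point at the Hadoop home directory rather than its `bin` subdirectory. A sketch, assuming the install path shown in the transcript:

```shell
# Rhipe's warning expects $HADOOP/bin/hadoop (or $HADOOP_BIN/hadoop) to exist,
# so HADOOP should be the Hadoop home directory, not .../bin
export HADOOP=/home/hadoop/project/hadoop-1.0.4
export HADOOP_BIN=$HADOOP/bin
# Sanity check: this file should exist
# ls "$HADOOP/bin/hadoop"
```

With `HADOOP` set to `.../hadoop-1.0.4/bin` as in the transcript, `$HADOOP/bin/hadoop` would resolve to a path that does not exist.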

Hoping for your suggestions.

Thanks

Ashrith

Feb 25, 2013, 9:40:46 AM
to rh...@googlegroups.com, saptars...@gmail.com
Is it a local-machine install? If so, can you check the scope of the server? Run netstat -atn and see whether the scope of your NameNode is 127.0.0.1, your external IP, or 0.0.0.0. This might sound silly, but could you also check whether you have started it?
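
In shell terms, that check might look like this (a sketch; the port comes from the stack trace above, the Hadoop path is assumed from the transcript earlier in the thread, and `start-all.sh` is the Hadoop 1.x startup script):

```shell
# See which address, if any, the NameNode (port 54310 above) is bound to
netstat -atn | grep 54310
# No output means nothing is listening: start the Hadoop daemons first
/home/hadoop/project/hadoop-1.0.4/bin/start-all.sh
# Then confirm the daemons are running: jps should list NameNode, DataNode,
# JobTracker, and TaskTracker
jps
```

If the NameNode turns out to be bound to an external IP rather than 127.0.0.1, the `fs.default.name` setting in core-site.xml has to match the address the client connects to.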