rhive.connect write perm error


YP

May 27, 2014, 6:00:00 PM
to rh...@googlegroups.com

Why does it need write permission on the root folder? Can I change that?

rhive.connect(host = "xxx", port = xxx);
No encryption was performed by peer.
No encryption was performed by peer.
Error: org.apache.hadoop.security.AccessControlException: Permission denied: user=xxx, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:224)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:204)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:149)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4846)

김영우

May 27, 2014, 10:12:37 PM
to rh...@googlegroups.com
Hi YP,

Which version of RHive are you running? You need to create a 'scratch' directory for RHive in HDFS; if it doesn't exist, the connection tries to create it under the HDFS root ("/"), which is why you're seeing the WRITE permission error there.

e.g.,

# sudo -u hdfs hadoop fs -mkdir /rhive
# sudo -u hdfs hadoop fs -chmod 777 /rhive
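
Once /rhive exists and is writable, a typical session looks roughly like this (the host, port, and install paths below are placeholders, not values from this thread, and assume HADOOP_HOME/HIVE_HOME point at your installs):

Sys.setenv(HADOOP_HOME = "/usr/lib/hadoop")   # adjust to your Hadoop install
Sys.setenv(HIVE_HOME = "/usr/lib/hive")       # adjust to your Hive install

library(RHive)
rhive.init()

# With /rhive already created and world-writable, connect no longer needs
# to create its scratch space under the HDFS root.
rhive.connect(host = "your-hiveserver-host", port = 10000)

rhive.query("SHOW TABLES")   # quick sanity check
rhive.close()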

Hope this helps.

- Youngwoo 

On Wednesday, May 28, 2014 at 7:00:00 AM UTC+9, YP wrote: