Alluxio and HDFS with "simple" authentication


shuy...@gmail.com

Aug 1, 2016, 5:46:52 AM
to Alluxio Users
Hi Guys,

Alluxio version: 1.2.0 against hadoop 2.6.0.
Hadoop version: 2.6.0.

My HDFS cluster has authentication enabled with "hadoop.security.authentication=simple", and Alluxio is running as user "ant". For some directories I get a permission error like the one below. How can I resolve it? Thank you!

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=ant, access=WRITE, inode="/test/data/web":hduser:hadoop:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:238)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:182)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6812)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:4255)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:4207)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:4191)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:837)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:603)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)

at org.apache.hadoop.ipc.Client.call(Client.java:1469)
at org.apache.hadoop.ipc.Client.call(Client.java:1400)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
at com.sun.proxy.$Proxy9.delete(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:521)
at sun.reflect.GeneratedMethodAccessor12.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy10.delete(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1918)
... 21 more

Chaomin Yu

Aug 1, 2016, 12:41:50 PM
to shuy...@gmail.com, Alluxio Users
Hi,

From the HDFS error message, the path "/test/data/web" has owner=hduser, group=hadoop, and permission=drwxr-xr-x in the HDFS namespace.
Alluxio, running as user "ant", does not have WRITE permission on this HDFS path, so this permission-denied error is expected.

You can either:
- run Alluxio (and log in) as hduser, or
- modify the permission of "/test/data/web" so it is writable by "ant", e.g. "drwxr-xrwx"
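For the second option, a sketch of the corresponding HDFS shell commands (run as hduser or an HDFS superuser against a live cluster; the paths match your error message):

```shell
# Open the directory for write by "others" (drwxr-xrwx == octal 757)
hdfs dfs -chmod 757 /test/data/web

# Or, alternatively, hand ownership to the user Alluxio runs as
hdfs dfs -chown ant:hadoop /test/data/web

# Verify the new permission bits
hdfs dfs -ls -d /test/data/web
```

Note that 757 grants write access to every HDFS user, so chown (or a narrower group-based setup) is usually the safer choice.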

Hope this helps,
Chaomin




--
Cheers,
Chaomin

shuy...@gmail.com

Aug 1, 2016, 9:41:51 PM
to Alluxio Users, shuy...@gmail.com
Hi,

Thanks for your help.

But our HDFS cluster needs authentication enabled based on the POSIX permission model, so that each user can only operate on the data they are allowed to access.

Does Alluxio provide a way to resolve this?

On Tuesday, August 2, 2016 at 12:41:50 AM UTC+8, Chaomin Yu wrote:

Chaomin Yu

Aug 1, 2016, 11:25:28 PM
to shuy...@gmail.com, Alluxio Users
Yes, Alluxio works with HDFS "hadoop.security.authentication=simple".
You can run Alluxio in security-enabled mode. You may find the Alluxio security documentation helpful.

Can you try to set the following configuration?
  • alluxio.security.authentication.type=SIMPLE
  • alluxio.security.authorization.permission.enabled=true
  • alluxio.security.login.username=<username who has access to your HDFS path>
Note that the Alluxio client login user you set must have access to the particular HDFS path;
otherwise such permission-denied errors are expected. (In your case, user "ant" does not have WRITE permission on "/test/data/web".)
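Those properties go into `conf/alluxio-site.properties` on the Alluxio masters, workers, and clients. A minimal sketch, assuming hduser is the account that owns your HDFS paths (substitute whichever user actually has access):

```properties
# conf/alluxio-site.properties (sketch; property names as in the Alluxio 1.2.0 security docs)
alluxio.security.authentication.type=SIMPLE
alluxio.security.authorization.permission.enabled=true
# Placeholder: a user HDFS grants access to, e.g. the owner of /test/data/web
alluxio.security.login.username=hduser
```

Restart the Alluxio processes after changing these properties so they take effect.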

Hope this helps,
Chaomin

shuy...@gmail.com

Aug 2, 2016, 1:44:05 AM
to Alluxio Users

I understand, thank you very much.

On Monday, August 1, 2016 at 5:46:52 PM UTC+8, shuy...@gmail.com wrote: