HDFS Connector Kerberos ticket relogin


Vladimir

Mar 23, 2017, 10:09:36 AM
to Confluent Platform
Hello everyone!
Recently I started investigating Kafka Connect as a way to move our data from Kafka to HDFS, and Kafka Connect + the HDFS Connector looked like exactly what we needed. Cool. But here is the problem: after the Kerberos ticket renew period ends (1 week, for example), the HDFS connector fails to get a new ticket, although it keeps trying. Here is the log. Hope someone can help :)
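For reference, the Kerberos-related connector settings look roughly like this (a sketch with placeholder values; the option names are the standard kafka-connect-hdfs ones, the exact values here are illustrative):

```properties
# Enable Kerberos authentication against HDFS
hdfs.authentication.kerberos=true
# Principal and keytab the connector logs in with (placeholder paths/names)
connect.hdfs.principal=${username}@${realm}
connect.hdfs.keytab=/etc/security/keytabs/connect.keytab
# NameNode principal; _HOST is substituted with the NameNode hostname
hdfs.namenode.principal=hdfs/_HOST@${realm}
# How often the connector attempts ticket renewal, in ms (default: one hour)
kerberos.ticket.renew.period.ms=3600000
```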

[2017-03-21 11:28:51,565] DEBUG Initiating logout for ${username}@${realm} (org.apache.hadoop.security.UserGroupInformation:1054)
[2017-03-21 11:28:51,565] DEBUG hadoop logout (org.apache.hadoop.security.UserGroupInformation:217)
[2017-03-21 11:28:51,566] DEBUG Initiating re-login for ${username}@${realm} (org.apache.hadoop.security.UserGroupInformation:1066)
[2017-03-21 11:28:51,569] DEBUG hadoop login (org.apache.hadoop.security.UserGroupInformation:209)
[2017-03-21 11:28:51,569] DEBUG hadoop login commit (org.apache.hadoop.security.UserGroupInformation:144)
[2017-03-21 11:28:51,569] DEBUG using existing subject:[${username}@${realm}, UnixPrincipal: root, UnixNumericUserPrincipal: 0, UnixNumericGroupPrincipal [Primary Group]: 0] (org.apache.hadoop.security.UserGroupInformation:149)
[2017-03-21 11:28:51,779] DEBUG PrivilegedAction as:${username}@${realm} (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:717) (org.apache.hadoop.security.UserGroupInformation:1652)
[2017-03-21 11:28:51,779] DEBUG Sending sasl message state: NEGOTIATE
 (org.apache.hadoop.security.SaslRpcClient:457)
[2017-03-21 11:28:51,779] DEBUG Received SASL message state: NEGOTIATE
auths {
  method: "TOKEN"
  mechanism: "DIGEST-MD5"
  protocol: ""
  serverId: "default"
  challenge: "realm=\"default\",nonce=\"${some-hash}/eOTgl\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
}
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "hdfs"
  serverId: "${host}.${realm}"
}
 (org.apache.hadoop.security.SaslRpcClient:389)
[2017-03-21 11:28:51,779] DEBUG Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector) (org.apache.hadoop.security.SaslRpcClient:264)
[2017-03-21 11:28:51,780] DEBUG Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal) (org.apache.hadoop.security.SaslRpcClient:291)
[2017-03-21 11:28:51,780] DEBUG getting serverKey: dfs.namenode.kerberos.principal conf value: hdfs/_HOST@${realm} principal: hdfs/${host}.${realm}@${realm} (org.apache.hadoop.security.SaslRpcClient:318)
[2017-03-21 11:28:51,780] DEBUG RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/${host}.${realm}@${realm} (org.apache.hadoop.security.SaslRpcClient:236)
[2017-03-21 11:28:51,780] DEBUG Creating SASL GSSAPI(KERBEROS)  client to authenticate to service at ${host}.${realm} (org.apache.hadoop.security.SaslRpcClient:247)
[2017-03-21 11:28:51,780] DEBUG Use KERBEROS authentication for protocol ClientNamenodeProtocolPB (org.apache.hadoop.security.SaslRpcClient:176)
[2017-03-21 11:28:51,780] DEBUG PrivilegedActionException as:${username}@${realm} (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] (org.apache.hadoop.security.UserGroupInformation:1632)
[2017-03-21 11:28:51,780] DEBUG PrivilegedAction as:${username}@${realm} (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:643) (org.apache.hadoop.security.UserGroupInformation:1652)
[2017-03-21 11:28:51,781] DEBUG Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] (org.apache.hadoop.ipc.Client:652)

I also created an issue on GitHub: https://github.com/confluentinc/kafka-connect-hdfs/issues/178
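For context, the one-week renew period is governed by the Kerberos client configuration. A typical krb5.conf fragment with such lifetimes (example values, not necessarily this exact setup) looks like:

```ini
[libdefaults]
    default_realm = ${realm}
    # Lifetime of a freshly issued TGT
    ticket_lifetime = 24h
    # Maximum renewable lifetime: after this, the TGT can no longer be
    # renewed and a fresh login (keytab or password) is required
    renew_lifetime = 7d
```

Once `renew_lifetime` is exhausted, renewal of the existing ticket fails and the client must obtain a brand-new TGT, which matches the "Failed to find any Kerberos tgt" errors in the log above.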

Vladimir

Mar 24, 2017, 4:09:06 AM
to Confluent Platform
Looks like this is the same issue as:

On Thursday, March 23, 2017 at 17:09:36 UTC+3, Vladimir wrote: