Yes, it works. The Hadoop, HBase, etc. clients called by gremlin-console and gremlin-server will find any locally stored Kerberos credentials pointed to by KRB5CCNAME, KRB5_KTNAME and KRB5_CONFIG. If any problems arise, you can set export KRB5_TRACE=/dev/stdout to get more debug information.
In TinkerPop 3.3, gremlin-server will also have a Kerberos authenticator for kerberized access to gremlin-server itself; a PR for this, written by me, was recently merged into TinkerPop.
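For reference, enabling that authenticator comes down to an authentication section in the gremlin-server YAML file. The fragment below is only a sketch; the class name is the one from the merged PR, but check the TinkerPop 3.3 security documentation for the exact config keys, and replace the principal and keytab placeholders with your own:
authentication: {
  authenticator: org.apache.tinkerpop.gremlin.server.auth.Krb5Authenticator,
  config: {
    principal: gremlin-server/hostname.example.com@EXAMPLE.COM,
    keytab: /etc/security/keytabs/gremlin-server.keytab}}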
KRB5_CONFIG=/etc/krb5.conf
KRB5CCNAME=/tmp/krb5cc_1003
KRB5_KTNAME=/etc/security/keytabs/gremlin.keytab
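These variables can simply be exported in the shell that launches the console; a minimal sketch, reusing the paths above and adding the optional KRB5_TRACE switch mentioned earlier:
export KRB5_CONFIG=/etc/krb5.conf
export KRB5CCNAME=/tmp/krb5cc_1003
export KRB5_KTNAME=/etc/security/keytabs/gremlin.keytab
export KRB5_TRACE=/dev/stdout    # optional: verbose Kerberos tracing on stdout
./bin/gremlin.sh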
gremlin@jkovacs-VirtualBox:/opt/janusgraph-0.2.0-hadoop2$ klist
Ticket cache: FILE:/tmp/krb5cc_1003
Default principal: gremlin/jkovacs-VirtualBox@EXAMPLE.COM
Valid starting Expires Service principal
18.1.2018 12:37:26 18.1.2018 22:37:26 krbtgt/EXAMPLE.COM@EXAMPLE.COM
renew until 19.1.2018 12:37:25
gremlin@jkovacs-VirtualBox:/opt/janusgraph-0.2.0-hadoop2$
gremlin@jkovacs-VirtualBox:/opt/janusgraph-0.2.0-hadoop2/conf$ cat java.env
export JVMFLAGS="-Djava.security.auth.login.config=/opt/janusgraph-0.2.0-hadoop2/conf/jaas.conf"
gremlin@jkovacs-VirtualBox:/opt/janusgraph-0.2.0-hadoop2/conf$ cat jaas.conf
Server {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/etc/security/keytabs/gremlin.keytab"
storeKey=true
useTicketCache=false
principal="gremlin/jkovacs-V...@EXAMPLE.COM";
};
Client {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/etc/security/keytabs/gremlin.keytab"
storeKey=true
useTicketCache=false
principal="gremlin/jkovacs-V...@EXAMPLE.COM";
};
gremlin@jkovacs-VirtualBox:/opt/janusgraph-0.2.0-hadoop2/conf$
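If in doubt whether the keytab and the principal referenced in jaas.conf actually match, the standard MIT Kerberos tools can verify them; a quick check, assuming the paths and principal shown above:
# list the principals stored in the keytab
klist -k /etc/security/keytabs/gremlin.keytab
# obtain a fresh ticket from the keytab and inspect the resulting cache
kinit -kt /etc/security/keytabs/gremlin.keytab gremlin/jkovacs-VirtualBox@EXAMPLE.COM
klist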
If this all seems OK, delete your table (assuming it is empty now and contains no valuable data) using the hbase shell and restart from scratch by opening the table from gremlin-console, just to be sure that you start from a clean situation.
Disclaimer: I do not understand how your situation arose; I am just mentioning some configs from my own setup with HDP-2.6.2 that I think might be relevant.
Cheers, Marc
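In case it helps, the cleanup Marc suggests would look roughly like this; 'janusgraf' is the table name from the properties pasted further below, and JanusGraph should recreate the HBase table the next time the graph is opened, provided the principal has the required create rights:
$ hbase shell
hbase(main):001:0> disable 'janusgraf'
hbase(main):002:0> drop 'janusgraf'
hbase(main):003:0> exit
Afterwards, reopen the graph from gremlin-console as shown in the session further below.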
jkovacs@hadoop.lan:[/opt/janusgraph-0.2.0-hadoop2]: ./bin/gremlin.sh
\,,,/
(o o)
-----oOOo-(3)-oOOo-----
plugin activated: janusgraph.imports
plugin activated: tinkerpop.server
plugin activated: tinkerpop.utilities
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/janusgraph-0.2.0-hadoop2/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/janusgraph-0.2.0-hadoop2/lib/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
09:07:25 WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.tinkerpop.gremlin.groovy.plugin.PluginInitializationException: No FileSystem for scheme: hdfs
at org.apache.tinkerpop.gremlin.hadoop.groovy.plugin.HadoopGremlinPlugin.afterPluginTo(HadoopGremlinPlugin.java:91)
at org.apache.tinkerpop.gremlin.groovy.plugin.AbstractGremlinPlugin.pluginTo(AbstractGremlinPlugin.java:86)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:232)
at org.apache.tinkerpop.gremlin.console.PluggedIn.activate(PluggedIn.groovy:58)
at org.apache.tinkerpop.gremlin.console.Console$_closure19.doCall(Console.groovy:146)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at groovy.lang.Closure.call(Closure.java:414)
at groovy.lang.Closure.call(Closure.java:430)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2040)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2025)
at org.codehaus.groovy.runtime.dgm$158.doMethodInvoke(Unknown Source)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:232)
at org.apache.tinkerpop.gremlin.console.Console.<init>(Console.groovy:133)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:232)
at org.apache.tinkerpop.gremlin.console.Console.main(Console.groovy:478)
Caused by: java.io.IOException: No FileSystem for scheme: hdfs
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2644)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:170)
at org.apache.tinkerpop.gremlin.hadoop.groovy.plugin.HadoopGremlinPlugin.afterPluginTo(HadoopGremlinPlugin.java:84)
... 21 more
jkovacs@hadoop.lan:[/opt/janusgraph-0.2.0-hadoop2]: echo $CLASSPATH
/usr/hdp/current/hadoop-client/conf:/usr/hdp/current/hbase-client/conf
jkovacs@hadoop.lan:[/opt/janusgraph-0.2.0-hadoop2]: echo $PATH
/usr/lib64/qt-3.3/bin:/opt/Python-3.6.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/opt/dell/srvadmin/bin:/home/jkovacs/bin

gremlin.graph=org.janusgraph.core.JanusGraphFactory
storage.backend=hbase
storage.hostname=<list-of-zookeeper-servers>
storage.hbase.table=janusgraf
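As an aside on the "No FileSystem for scheme: hdfs" exception above: it generally means that no FileSystem implementation is registered for the hdfs scheme, i.e. the hadoop-hdfs client jar (which contains org.apache.hadoop.hdfs.DistributedFileSystem) is not on the classpath or not readable. A quick check along those lines, assuming the install path above, before applying the chmod shown below:
# is the HDFS client jar shipped with the JanusGraph distribution present and readable?
ls -l /opt/janusgraph-0.2.0-hadoop2/lib/hadoop-hdfs-*.jar
# does it contain the implementation class for the hdfs:// scheme?
unzip -l /opt/janusgraph-0.2.0-hadoop2/lib/hadoop-hdfs-2.7.2.jar | grep DistributedFileSystem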
chmod 664 lib/hadoop-hdfs-2.7.2.jar

gremlin.graph=org.janusgraph.core.JanusGraphFactory
storage.backend=hbase
storage.hostname=YOUR_HBASE_HOSTNAME
storage.hbase.table=janusgraph_default_autodeploy
storage.hbase.ext.hbase.zookeeper.quorum=YOUR-ZOOKEEPER-HOSTNAME
storage.hbase.ext.zookeeper.znode.parent=/hbase-secure
storage.hbase.ext.hbase.zookeeper.property.clientPort=2181
storage.hbase.ext.hadoop.security.authentication=kerberos
storage.hbase.ext.hadoop.security.authorization=true
storage.hbase.ext.hbase.security.authentication=kerberos
storage.hbase.ext.hbase.security.authorization=true
cache.db-cache=true
cache.db-cache-time=180000
cache.db-cache-size=0.5
cache.db-cache-clean-wait=20
java.security.krb5.conf=/etc/krb5.conf
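Note that java.security.krb5.conf (and the java.security.auth.login.config pointing to jaas.conf) are JVM system properties rather than JanusGraph keys. If your JanusGraph version does not accept them in the properties file, one way to pass them is via the environment before launching the console; a sketch, assuming your copy of bin/gremlin.sh honours JAVA_OPTIONS as the stock TinkerPop script does:
export JAVA_OPTIONS="-Djava.security.krb5.conf=/etc/krb5.conf -Djava.security.auth.login.config=/opt/janusgraph-0.2.0-hadoop2/conf/jaas.conf"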
# bin/gremlin.sh
gremlin> graph = JanusGraphFactory.open('conf/your-janusgraph.properties')
gremlin> g = graph.traversal()
gremlin> g.addV().property('name','test')
==>v[4296]
gremlin> g.tx().commit()
==>null
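To confirm that the commit actually reached HBase, a quick read-back in the same session with standard Gremlin steps could look like this (the property name is the one written above):
gremlin> g.V().has('name','test').values('name')
gremlin> g.V().has('name','test').count()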