Hi all,
has anyone already tried to store data using OpenSOC's HBaseBolt? And if so, did you succeed?
In my case, the following error occurs, and I wonder whether it's a cluster configuration problem or a problem in OpenSOC's code.
Here is the error:
java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2106) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.security.Groups.<init>(Groups.java:70) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.security.Groups.<init>(Groups.java:66) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:280) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:271) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:248) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:763) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:748) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:621) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_67]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_67]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_67]
at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_67]
at org.apache.hadoop.hbase.util.Methods.call(Methods.java:39) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.security.User.call(User.java:431) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.security.User.callStatic(User.java:421) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.security.User.access$200(User.java:49) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:241) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:236) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.security.User.getCurrent(User.java:159) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.security.UserProvider.getCurrent(UserProvider.java:86) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.client.HConnectionKey.<init>(HConnectionKey.java:70) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:271) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:197) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:159) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at com.opensoc.hbase.HTableConnector.<init>(HTableConnector.java:51) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at com.opensoc.hbase.HBaseBolt.prepare(HBaseBolt.java:58) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at backtype.storm.daemon.executor$fn__3907$fn__3920.invoke(executor.clj:732) ~[storm-core-0.9.3.2.2.6.0-2800.jar:0.9.3.2.2.6.0-2800]
at backtype.storm.util$async_loop$fn__451.invoke(util.clj:463) ~[storm-core-0.9.3.2.2.6.0-2800.jar:0.9.3.2.2.6.0-2800]
at clojure.lang.AFn.run(AFn.java:24) [clojure-1.5.1.jar:na]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_67]
I've already tried several approaches, but none of them worked.
As far as I understand, the problem arises because the "hadoop-common" library is missing from the classpath. I therefore added the corresponding path to Storm's "java.library.path", but the problem remained.
I also tried adding an explicit dependency to the pom.xml of OpenSOC-Commons:
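For reference, the change to storm.yaml looked roughly like this (the appended path is purely illustrative and depends on where Hadoop is installed; also note that java.library.path is normally meant for native JNI libraries rather than for jars):

```yaml
# Illustrative sketch only -- actual paths depend on the installation.
java.library.path: "/usr/local/lib:/opt/local/lib:/usr/lib:/usr/lib/hadoop/lib"
```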
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.2.1</version>
</dependency>
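(A side note on this attempt: hadoop-core 1.2.1 belongs to the Hadoop 1.x line, while the JniBasedUnixGroupsMappingWithFallback class from the first stack trace only appears in the Hadoop 2.x "hadoop-common" artifact. So if the cluster runs Hadoop 2.x, a dependency along these lines might be the matching one; the version below is illustrative and would have to match the cluster:)

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <!-- illustrative: should match the Hadoop version on the cluster -->
  <version>2.4.0</version>
</dependency>
```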
But this leads to another error:
Caused by: java.lang.RuntimeException: Socket Factory class not found: java.lang.ClassNotFoundException: org.apache.hadoop.net.StandardSocketFactory
at org.apache.hadoop.net.NetUtils.getSocketFactoryFromProperty(NetUtils.java:120) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.net.NetUtils.getDefaultSocketFactory(NetUtils.java:100) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.net.NetUtils.getSocketFactory(NetUtils.java:80) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:263) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:104) ~[OpenSOC-Topologies-0.5BETA.jar:na]
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:202) ~[OpenSOC-Topologies-0.5BETA.jar:na]
... 20 common frames omitted
Interestingly, the StandardSocketFactory class is also part of the "hadoop-core" library, so if the added dependency had really resolved the first error, this second error should not occur. But it does. :-|
Then I tried a dirty hack and copied "hadoop-common.jar" into Storm's lib directory (after removing the dependency from the pom.xml again), but this didn't help either, and the first error occurred again.
All in all, it seems there is a problem with unresolved dependencies, but all my attempts to solve it have failed, and I have the feeling I'm missing something important. Maybe some of you have an idea what the real problem could be?
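To check which of the two classes is actually visible at runtime, I used a small probe like the one below (a minimal sketch; the class names are taken from the stack traces above, and run standalone outside the worker JVM it will of course report both as missing):

```java
// Minimal sketch: probe whether the classes named in the stack traces
// are visible on the current classpath, without initializing them.
public class ClasspathProbe {

    // Returns true if the named class can be located by this classloader.
    static boolean present(String name) {
        try {
            Class.forName(name, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("JniBasedUnixGroupsMappingWithFallback present: "
            + present("org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback"));
        System.out.println("StandardSocketFactory present: "
            + present("org.apache.hadoop.net.StandardSocketFactory"));
    }
}
```

Running this (for example from the bolt's prepare() method) shows whether the shaded topology jar really carries each class.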
Thanks,
Sergej