Hello,
I have installed CDAP 3.4.0-1 on CDH 5.5.2 / CM 5.7, and I am seeing the following exceptions in master.log and router.log:
2016-05-03 06:14:05,519 DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
2016-05-03 06:14:05,760 DEBUG org.apache.hadoop.security.Groups: Creating new Groups object
2016-05-03 06:14:05,957 DEBUG org.apache.hadoop.util.Shell: Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:325) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:350) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:130) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.security.Groups.<init>(Groups.java:94) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.security.Groups.<init>(Groups.java:74) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:303) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:337) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:331) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at co.cask.cdap.common.kerberos.SecurityUtil.loginForMasterService(SecurityUtil.java:148) [co.cask.cdap.cdap-common-3.4.0.jar:na]
at co.cask.cdap.master.startup.MasterStartupTool.main(MasterStartupTool.java:75) [co.cask.cdap.cdap-master-3.4.0.jar:na]
2016-05-03 06:14:06,046 DEBUG org.apache.hadoop.util.Shell: setsid exited with exit code 0
2016-05-03 06:14:06,070 DEBUG org.apache.hadoop.security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
2016-05-03 06:14:06,372 DEBUG co.cask.cdap.master.startup.MasterStartupTool: Adding startup checks from package co.cask.cdap.master.startup
2016-05-03 06:14:07,090 INFO co.cask.cdap.common.guice.LocationRuntimeModule: HDFS namespace is /cdap
2016-05-03 06:14:07,091 DEBUG co.cask.cdap.common.guice.LocationRuntimeModule: Getting filesystem for user cdap
2016-05-03 06:14:07,094 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:cdap (auth:SIMPLE) from:co.cask.cdap.common.guice.LocationRuntimeModule$HDFSLocationModule.providesLocationFactory(LocationRuntimeModule.java:123)
2016-05-03 06:14:07,105 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:cdap (auth:SIMPLE) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:334)
2016-05-03 06:14:07,306 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
2016-05-03 06:14:07,307 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
2016-05-03 06:14:07,308 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
2016-05-03 06:14:07,309 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.domain.socket.path = /storage/run/hdfs-sockets/dn
2016-05-03 06:14:07,386 DEBUG org.apache.hadoop.io.retry.RetryUtils: multipleLinearRandomRetry = null
2016-05-03 06:14:07,415 DEBUG org.apache.hadoop.ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@29e6eb25
2016-05-03 06:14:07,434 DEBUG org.apache.hadoop.ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@7e3060d8
2016-05-03 06:14:07,885 DEBUG org.apache.hadoop.util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
2016-05-03 06:14:07,886 DEBUG org.apache.hadoop.util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2016-05-03 06:14:07,887 DEBUG org.apache.hadoop.util.NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2016-05-03 06:14:07,887 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-03 06:14:07,898 DEBUG org.apache.hadoop.util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
2016-05-03 06:14:07,904 DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
2016-05-03 06:14:07,909 DEBUG org.apache.hadoop.crypto.OpensslCipher: Failed to load OpenSSL Cipher.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()Z
at org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_77]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_77]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_77]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [na:1.8.0_77]
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:68) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:101) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.fs.Hdfs.<init>(Hdfs.java:91) [hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_77]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_77]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_77]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [na:1.8.0_77]
at org.apache.hadoop.fs.AbstractFileSystem.newInstance(AbstractFileSystem.java:129) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:157) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:242) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:337) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:334) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_77]
at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_77]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:334) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:451) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:473) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at org.apache.twill.filesystem.FileContextLocationFactory.createFileContext(FileContextLocationFactory.java:130) [co.cask.cdap.cdap-common-3.4.0.jar:0.7.0-incubating]
at org.apache.twill.filesystem.FileContextLocationFactory.<init>(FileContextLocationFactory.java:56) [co.cask.cdap.cdap-common-3.4.0.jar:0.7.0-incubating]
at co.cask.cdap.common.guice.LocationRuntimeModule$HDFSLocationModule$1.run(LocationRuntimeModule.java:126) [co.cask.cdap.cdap-common-3.4.0.jar:na]
at co.cask.cdap.common.guice.LocationRuntimeModule$HDFSLocationModule$1.run(LocationRuntimeModule.java:123) [co.cask.cdap.cdap-common-3.4.0.jar:na]
at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_77]
at javax.security.auth.Subject.doAs(Subject.java:360) [na:1.8.0_77]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1651) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
at co.cask.cdap.common.guice.LocationRuntimeModule$HDFSLocationModule.providesLocationFactory(LocationRuntimeModule.java:123) [co.cask.cdap.cdap-common-3.4.0.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_77]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_77]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_77]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_77]
at com.google.inject.internal.ProviderMethod.get(ProviderMethod.java:104) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.Scopes$1$1.get(Scopes.java:65) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.ExposedKeyFactory.get(ExposedKeyFactory.java:54) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:38) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:62) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:84) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1024) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974) [com.google.inject.guice-3.0.jar:na]
at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1013) [com.google.inject.guice-3.0.jar:na]
at co.cask.cdap.common.startup.CheckRunner$Builder.addChecksInPackage(CheckRunner.java:99) [co.cask.cdap.cdap-common-3.4.0.jar:na]
at co.cask.cdap.master.startup.MasterStartupTool.createCheckRunner(MasterStartupTool.java:119) [co.cask.cdap.cdap-master-3.4.0.jar:na]
at co.cask.cdap.master.startup.MasterStartupTool.<init>(MasterStartupTool.java:90) [co.cask.cdap.cdap-master-3.4.0.jar:na]
at co.cask.cdap.master.startup.MasterStartupTool.main(MasterStartupTool.java:83) [co.cask.cdap.cdap-master-3.4.0.jar:na]
2016-05-03 06:14:07,910 DEBUG org.apache.hadoop.util.PerformanceAdvisory: Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
2016-05-03 06:14:07,910 DEBUG org.apache.hadoop.util.PerformanceAdvisory: Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
2016-05-03 06:14:07,914 INFO co.cask.cdap.master.startup.ConfigurationCheck: Checking that config settings are valid.
2016-05-03 06:14:08,125 INFO co.cask.cdap.master.startup.ConfigurationCheck: Configuration successfully verified.
2016-05-03 06:14:08,125 INFO co.cask.cdap.master.startup.HBaseCheck: Checking HBase availability.
2016-05-03 06:14:08,163 DEBUG org.apache.hadoop.security.UserGroupInformation: hadoop login
2016-05-03 06:14:08,164 DEBUG org.apache.hadoop.security.UserGroupInformation: hadoop login commit
2016-05-03 06:14:08,168 DEBUG org.apache.hadoop.security.UserGroupInformation: using local user:UnixPrincipal: cdap
2016-05-03 06:14:08,169 DEBUG org.apache.hadoop.security.UserGroupInformation: Using user: "UnixPrincipal: cdap" with name cdap
2016-05-03 06:14:08,169 DEBUG org.apache.hadoop.security.UserGroupInformation: User entry: "cdap"
2016-05-03 06:14:08,170 DEBUG org.apache.hadoop.security.UserGroupInformation: UGI loginUser:cdap (auth:SIMPLE)
2016-05-03 06:14:08,213 TRACE org.apache.hadoop.hbase.client.RetryingCallerInterceptorFactory: Using NoOpRetryableCallerInterceptor for intercepting the RpcRetryingCaller
Could you let me know how to fix the above two exceptions manually?
1. Failed to detect a valid hadoop home directory
2. Failed to load OpenSSL Cipher
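For reference, this is roughly the manual workaround I had in mind, assuming a CM parcel layout. The paths below are guesses for my install, not verified, so please correct me if they are wrong:

```shell
# Hypothetical workaround sketch -- paths assume a CM parcel install and may differ.

# 1. Point HADOOP_HOME at the Hadoop client install so that
#    org.apache.hadoop.util.Shell.checkHadoopHome() can find a home directory.
export HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop

# 2. The OpenSSL cipher failure is an UnsatisfiedLinkError, i.e. libhadoop.so was
#    never loaded (see the "no hadoop in java.library.path" line above). Adding the
#    native library directory to the loader path should let NativeCodeLoader find it.
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

# 3. Afterwards, 'hadoop checknative -a' should report which native components
#    (zlib, snappy, openssl, ...) are actually available.
echo "HADOOP_HOME=$HADOOP_HOME"
echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
```

I have not applied this yet, so I would appreciate confirmation that this is the right approach before I change the service environment.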
Also, if the built-in Java library is used instead of the native Hadoop library, what is the impact on a CDAP application developed on top of Hadoop?
Thanks.
Best Regards,
Fred