Failed to detect a valid hadoop home directory


Fredrik

May 3, 2016, 5:36:01 AM5/3/16
to CDAP User
Hello,

I have installed CDAP 3.4.0-1 on CDH 5.5.2/CM 5.7. I saw the following exceptions in the master.log and router.log:

2016-05-03 06:14:05,519 DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
2016-05-03 06:14:05,760 DEBUG org.apache.hadoop.security.Groups:  Creating new Groups object
2016-05-03 06:14:05,957 DEBUG org.apache.hadoop.util.Shell: Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
    at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:325) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:350) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:130) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.security.Groups.<init>(Groups.java:94) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.security.Groups.<init>(Groups.java:74) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:303) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:337) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:331) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at co.cask.cdap.common.kerberos.SecurityUtil.loginForMasterService(SecurityUtil.java:148) [co.cask.cdap.cdap-common-3.4.0.jar:na]
    at co.cask.cdap.master.startup.MasterStartupTool.main(MasterStartupTool.java:75) [co.cask.cdap.cdap-master-3.4.0.jar:na]
2016-05-03 06:14:06,046 DEBUG org.apache.hadoop.util.Shell: setsid exited with exit code 0
2016-05-03 06:14:06,070 DEBUG org.apache.hadoop.security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
2016-05-03 06:14:06,372 DEBUG co.cask.cdap.master.startup.MasterStartupTool: Adding startup checks from package co.cask.cdap.master.startup
2016-05-03 06:14:07,090 INFO co.cask.cdap.common.guice.LocationRuntimeModule: HDFS namespace is /cdap
2016-05-03 06:14:07,091 DEBUG co.cask.cdap.common.guice.LocationRuntimeModule: Getting filesystem for user cdap
2016-05-03 06:14:07,094 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:cdap (auth:SIMPLE) from:co.cask.cdap.common.guice.LocationRuntimeModule$HDFSLocationModule.providesLocationFactory(LocationRuntimeModule.java:123)
2016-05-03 06:14:07,105 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:cdap (auth:SIMPLE) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:334)
2016-05-03 06:14:07,306 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
2016-05-03 06:14:07,307 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
2016-05-03 06:14:07,308 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
2016-05-03 06:14:07,309 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.domain.socket.path = /storage/run/hdfs-sockets/dn
2016-05-03 06:14:07,386 DEBUG org.apache.hadoop.io.retry.RetryUtils: multipleLinearRandomRetry = null
2016-05-03 06:14:07,415 DEBUG org.apache.hadoop.ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@29e6eb25
2016-05-03 06:14:07,434 DEBUG org.apache.hadoop.ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@7e3060d8
2016-05-03 06:14:07,885 DEBUG org.apache.hadoop.util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
2016-05-03 06:14:07,886 DEBUG org.apache.hadoop.util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2016-05-03 06:14:07,887 DEBUG org.apache.hadoop.util.NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2016-05-03 06:14:07,887 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-03 06:14:07,898 DEBUG org.apache.hadoop.util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
2016-05-03 06:14:07,904 DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
2016-05-03 06:14:07,909 DEBUG org.apache.hadoop.crypto.OpensslCipher: Failed to load OpenSSL Cipher.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_77]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_77]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_77]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [na:1.8.0_77]
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:68) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:101) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.fs.Hdfs.<init>(Hdfs.java:91) [hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_77]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_77]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_77]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [na:1.8.0_77]
    at org.apache.hadoop.fs.AbstractFileSystem.newInstance(AbstractFileSystem.java:129) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:157) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:242) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:337) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:334) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_77]
    at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_77]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:334) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:451) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:473) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.twill.filesystem.FileContextLocationFactory.createFileContext(FileContextLocationFactory.java:130) [co.cask.cdap.cdap-common-3.4.0.jar:0.7.0-incubating]
    at org.apache.twill.filesystem.FileContextLocationFactory.<init>(FileContextLocationFactory.java:56) [co.cask.cdap.cdap-common-3.4.0.jar:0.7.0-incubating]
    at co.cask.cdap.common.guice.LocationRuntimeModule$HDFSLocationModule$1.run(LocationRuntimeModule.java:126) [co.cask.cdap.cdap-common-3.4.0.jar:na]
    at co.cask.cdap.common.guice.LocationRuntimeModule$HDFSLocationModule$1.run(LocationRuntimeModule.java:123) [co.cask.cdap.cdap-common-3.4.0.jar:na]
    at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_77]
    at javax.security.auth.Subject.doAs(Subject.java:360) [na:1.8.0_77]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1651) [hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at co.cask.cdap.common.guice.LocationRuntimeModule$HDFSLocationModule.providesLocationFactory(LocationRuntimeModule.java:123) [co.cask.cdap.cdap-common-3.4.0.jar:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_77]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_77]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_77]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_77]
    at com.google.inject.internal.ProviderMethod.get(ProviderMethod.java:104) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.Scopes$1$1.get(Scopes.java:65) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.ExposedKeyFactory.get(ExposedKeyFactory.java:54) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:38) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:62) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:84) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1024) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974) [com.google.inject.guice-3.0.jar:na]
    at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1013) [com.google.inject.guice-3.0.jar:na]
    at co.cask.cdap.common.startup.CheckRunner$Builder.addChecksInPackage(CheckRunner.java:99) [co.cask.cdap.cdap-common-3.4.0.jar:na]
    at co.cask.cdap.master.startup.MasterStartupTool.createCheckRunner(MasterStartupTool.java:119) [co.cask.cdap.cdap-master-3.4.0.jar:na]
    at co.cask.cdap.master.startup.MasterStartupTool.<init>(MasterStartupTool.java:90) [co.cask.cdap.cdap-master-3.4.0.jar:na]
    at co.cask.cdap.master.startup.MasterStartupTool.main(MasterStartupTool.java:83) [co.cask.cdap.cdap-master-3.4.0.jar:na]
2016-05-03 06:14:07,910 DEBUG org.apache.hadoop.util.PerformanceAdvisory: Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
2016-05-03 06:14:07,910 DEBUG org.apache.hadoop.util.PerformanceAdvisory: Using crypto codec org.apache.hadoop.crypto.JceAesCtrCryptoCodec.
2016-05-03 06:14:07,914 INFO co.cask.cdap.master.startup.ConfigurationCheck: Checking that config settings are valid.
2016-05-03 06:14:08,125 INFO co.cask.cdap.master.startup.ConfigurationCheck:   Configuration successfully verified.
2016-05-03 06:14:08,125 INFO co.cask.cdap.master.startup.HBaseCheck: Checking HBase availability.
2016-05-03 06:14:08,163 DEBUG org.apache.hadoop.security.UserGroupInformation: hadoop login
2016-05-03 06:14:08,164 DEBUG org.apache.hadoop.security.UserGroupInformation: hadoop login commit
2016-05-03 06:14:08,168 DEBUG org.apache.hadoop.security.UserGroupInformation: using local user:UnixPrincipal: cdap
2016-05-03 06:14:08,169 DEBUG org.apache.hadoop.security.UserGroupInformation: Using user: "UnixPrincipal: cdap" with name cdap
2016-05-03 06:14:08,169 DEBUG org.apache.hadoop.security.UserGroupInformation: User entry: "cdap"
2016-05-03 06:14:08,170 DEBUG org.apache.hadoop.security.UserGroupInformation: UGI loginUser:cdap (auth:SIMPLE)
2016-05-03 06:14:08,213 TRACE org.apache.hadoop.hbase.client.RetryingCallerInterceptorFactory: Using NoOpRetryableCallerInterceptor for intercepting the RpcRetryingCaller


Could you let me know how to fix these two exceptions manually?
1. Failed to detect a valid hadoop home directory
2. Failed to load OpenSSL Cipher
Also, if the built-in Java classes are used instead of the native hadoop library, what is the impact on a CDAP application developed on Hadoop?

Thanks.
Best Regards,
Fred

Ali Anwar

May 3, 2016, 2:51:41 PM5/3/16
to Fredrik, CDAP User
Hey Fred.

1. This message is at log level DEBUG and is not critical; it can be ignored.
2. It seems you don't have the OpenSSL cipher package installed/configured. Can you share what you are trying to do (e.g. use HDFS encryption)? Note that this is also at log level DEBUG. Is CDAP otherwise functioning properly? Are there any ERROR-level messages?

Regards,

Ali Anwar
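Ali's suggestion to look for ERROR-level messages can be done with a quick grep. A minimal self-contained sketch follows; the sample log lines, including the ERROR entry and its class name, are fabricated for illustration, and in practice LOG would point at the real master.log:

```shell
# Demo: filter a log down to the ERROR/FATAL lines that matter,
# dropping DEBUG noise such as the "hadoop home" message.
# Sample data is inline; point LOG at /var/log/cdap/master.log in practice.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
2016-05-03 06:14:05,957 DEBUG org.apache.hadoop.util.Shell: Failed to detect a valid hadoop home directory
2016-05-03 06:14:08,125 INFO co.cask.cdap.master.startup.ConfigurationCheck: Configuration successfully verified.
2016-05-03 06:14:09,001 ERROR co.cask.cdap.example.Service: something actually broke
EOF
# Keep only ERROR/FATAL lines (note the surrounding spaces, which match
# the " LEVEL " column in the log format).
OUT=$(grep -E ' (ERROR|FATAL) ' "$LOG")
echo "$OUT"
rm -f "$LOG"
```

If this prints nothing, the services logged no ERROR/FATAL entries and the DEBUG messages above can safely be ignored.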

--
You received this message because you are subscribed to the Google Groups "CDAP User" group.
To unsubscribe from this group and stop receiving emails from it, send an email to cdap-user+...@googlegroups.com.
To post to this group, send email to cdap...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/cdap-user/cbfff6c1-27ea-4f24-a3de-3a5a392aeedc%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Fredrik

May 3, 2016, 10:38:56 PM5/3/16
to CDAP User, frederic...@gmail.com
Hello Ali,

I have posted another topic about the server startup issue ("UI and Master process down - Discoverable endpoint appfabric not found").
For item 1, in a CDH environment, what should HADOOP_HOME be? And what is the difference between the native hadoop library and the built-in Java implementation for a CDAP application that accesses files in HDFS?
For item 2, I currently have no OpenSSL requirement, so I will ignore it.

Thank you.
Best Regards, Fred

vin...@cask.co

May 9, 2016, 4:00:07 PM5/9/16
to CDAP User, frederic...@gmail.com
Hey Fredrik,

For item 1, on CDH 5.5.2, a sample value for HADOOP_HOME would be something like /opt/cloudera/parcels/CDH-5.5.2-1.cdh5.5.2.p0.4/lib/hadoop. The difference between the native hadoop library and the built-in Java implementation can be found here

Regards,
Vinisha
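As a concrete sketch of Vinisha's suggestion, the variable could be set through the CDAP service's environment Safety Valve in Cloudera Manager. The parcel path below is only a sample and must match the actual parcel directory on the cluster:

```shell
# Sample entry for the CDAP service's environment Safety Valve in
# Cloudera Manager; the parcel path is an example and varies by install.
HADOOP_HOME=/opt/cloudera/parcels/CDH-5.5.2-1.cdh5.5.2.p0.4/lib/hadoop

# To see which native libraries Hadoop can actually load on a host,
# Hadoop ships a check command that can be run on any gateway node:
hadoop checknative -a
```

`hadoop checknative -a` prints a line per native library (hadoop, zlib, snappy, openssl, ...) with true/false, which makes it easy to see whether the native-hadoop and OpenSSL warnings reflect a real gap on the host.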

Fredrik

May 9, 2016, 9:06:43 PM5/9/16
to CDAP User, frederic...@gmail.com
Hello Vinisha,

Thanks for the useful information. I set the HADOOP_HOME environment variable in the 'Cask DAP Service Environment Advanced Configuration Snippet (Safety Valve)', and the log no longer shows the HADOOP_HOME-not-set warning. However, the native-hadoop library still could not be loaded. Are there extra steps I need to configure?

2016-05-10 00:55:22,522 INFO co.cask.cdap.common.startup.ConfigurationLogger: Important config settings:
2016-05-10 00:55:22,614 INFO co.cask.cdap.common.startup.ConfigurationLogger:   ssl.enabled: false
2016-05-10 00:55:22,614 INFO co.cask.cdap.common.startup.ConfigurationLogger:   security.enabled: false
2016-05-10 00:55:22,615 INFO co.cask.cdap.common.startup.ConfigurationLogger:   explore.enabled: true
2016-05-10 00:55:22,615 INFO co.cask.cdap.common.startup.ConfigurationLogger:   router.bind.port: 11015
2016-05-10 00:55:22,615 INFO co.cask.cdap.common.startup.ConfigurationLogger:   router.ssl.bind.port: 10443
2016-05-10 00:55:22,616 INFO co.cask.cdap.common.startup.ConfigurationLogger:   dashboard.bind.port: 9999
2016-05-10 00:55:22,616 INFO co.cask.cdap.common.startup.ConfigurationLogger:   dashboard.ssl.bind.port: 9443
2016-05-10 00:55:22,616 INFO co.cask.cdap.common.startup.ConfigurationLogger:   security.auth.server.bind.port: 10009
2016-05-10 00:55:22,617 INFO co.cask.cdap.common.startup.ConfigurationLogger:   security.auth.server.ssl.bind.port: 10010
2016-05-10 00:55:22,617 INFO co.cask.cdap.master.startup.MasterStartupTool: Hadoop subsystem versions:
2016-05-10 00:55:22,624 INFO co.cask.cdap.master.startup.MasterStartupTool:   Hadoop version: 2.6.0-cdh5.5.2
2016-05-10 00:55:22,637 INFO co.cask.cdap.master.startup.MasterStartupTool:   HBase version: 1.0.0-cdh5.5.2
2016-05-10 00:55:22,639 INFO co.cask.cdap.master.startup.MasterStartupTool:   ZooKeeper version: 3.4.5.1405704
2016-05-10 00:55:22,641 INFO co.cask.cdap.master.startup.MasterStartupTool:   Kafka version: 0.8.2.2
2016-05-10 00:55:22,761 INFO co.cask.cdap.master.startup.MasterStartupTool:   Hive version: 1.1.0-cdh5.5.2
2016-05-10 00:55:22,764 INFO co.cask.cdap.master.startup.MasterStartupTool: CDAP version: 3.4.0-1461970648992
2016-05-10 00:55:22,764 INFO co.cask.cdap.master.startup.MasterStartupTool: CDAP HBase compat version: HBASE_10_CDH55
2016-05-10 00:55:22,766 INFO co.cask.cdap.master.startup.MasterStartupTool: Tephra HBase compat version: HBASE_10_CDH
2016-05-10 00:55:24,018 INFO co.cask.cdap.common.guice.LocationRuntimeModule: HDFS namespace is /cdap
2016-05-10 00:55:24,712 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-10 00:55:24,738 INFO co.cask.cdap.master.startup.FileSystemCheck: Checking FileSystem availability.
2016-05-10 00:55:24,841 INFO co.cask.cdap.master.startup.FileSystemCheck:   FileSystem availability successfully verified.
2016-05-10 00:55:24,843 INFO co.cask.cdap.master.startup.FileSystemCheck: Checking that user cdap has permission to write to /cdap on the FileSystem.
2016-05-10 00:55:24,922 INFO co.cask.cdap.master.startup.FileSystemCheck:   FileSystem permissions successfully verified.
2016-05-10 00:55:24,923 INFO co.cask.cdap.master.startup.HBaseCheck: Checking HBase availability.
2016-05-10 00:55:25,623 INFO co.cask.cdap.master.startup.HBaseCheck:   HBase availability successfully verified.
2016-05-10 00:55:25,730 INFO co.cask.cdap.master.startup.ConfigurationCheck: Checking that config settings are valid.
2016-05-10 00:55:25,913 INFO co.cask.cdap.master.startup.ConfigurationCheck:   Configuration successfully verified.
2016-05-10 00:55:25,914 INFO co.cask.cdap.master.startup.YarnCheck: Checking YARN availability -- may take up to 60 seconds.
2016-05-10 00:55:26,170 INFO co.cask.cdap.master.startup.YarnCheck:   YARN availability successfully verified.
2016-05-10 00:55:26,170 INFO co.cask.cdap.master.startup.YarnCheck: Checking that YARN has enough resources to run all system services.
2016-05-10 00:55:26,171 INFO co.cask.cdap.master.startup.YarnCheck:   YARN resources successfully verified.
2016-05-10 00:55:27,697 INFO co.cask.cdap.data.runtime.main.MasterServiceMain: Starting MasterServiceMain
2016-05-10 00:55:28,943 INFO co.cask.cdap.common.guice.LocationRuntimeModule: HDFS namespace is /cdap
2016-05-10 00:55:29,615 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-10 00:55:29,742 INFO co.cask.cdap.security.TokenSecureStoreUpdater: Setting token renewal time to: 86100000 ms
2016-05-10 00:55:29,777 INFO co.cask.cdap.common.io.URLConnections: Turning off default caching in URLConnection
2016-05-10 00:55:30,669 INFO co.cask.cdap.data2.util.hbase.ConfigurationTable: Writing new config row with key DEFAULT
2016-05-10 00:55:30,679 INFO co.cask.cdap.data2.util.hbase.ConfigurationTable: Deleting any configuration from 1462841730667 or before
(attachment: master.log)
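The remaining warning comes from Hadoop's NativeCodeLoader, which only loads libhadoop if it is on java.library.path; the stock Hadoop launcher scripts populate that from the JAVA_LIBRARY_PATH environment variable. A hedged sketch of the extra entry one might try in the same Safety Valve follows; both the parcel path and the assumption that CDAP's launcher forwards this variable to its JVMs the way the Hadoop scripts do are unverified:

```shell
# Assumptions: CDH parcel layout, and that the CDAP launcher honors
# JAVA_LIBRARY_PATH like the stock Hadoop scripts. The CDH native
# libraries (libhadoop.so and friends) normally live under:
JAVA_LIBRARY_PATH=/opt/cloudera/parcels/CDH/lib/hadoop/lib/native
```

Whether this resolves the warning depends on how the CDAP service builds its JVM command line; as noted earlier in the thread, the warning itself is harmless.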

vin...@cask.co

May 10, 2016, 3:47:42 PM5/10/16
to CDAP User, frederic...@gmail.com
Hey Frederic,

This warning message should not affect any CDAP functionality. I also checked the master logs; all the services look up and running, so for now this warning can be ignored.

We have filed a JIRA to resolve this warning message. Please follow https://issues.cask.co/browse/CDAP-5959 for more information.


Thanks,
Vinisha

Fredrik

May 10, 2016, 8:29:28 PM5/10/16
to CDAP User, frederic...@gmail.com
Thank you Vinisha : )

Best regards,
Frederic