CDAP 4.3.4 - Hive explorer disabled in UI (eye icon)


Omar Meza

Aug 6, 2018, 3:56:28 PM
to CDAP User
Installed CDAP 4.3.4 on HDP 2.5.3 and I'm having issues with data exploration. It is disabled in the UI, so I cannot run Hive queries against any dataset. Also, Hive tables are not being created.

Explorer.enabled=true

The cdap user has access to read/create Hive databases/tables.

Any advice?

Thank you!
Omar

Ali Anwar

Aug 6, 2018, 4:12:52 PM
to cdap...@googlegroups.com
Hi Omar.

The property in cdap-site.xml that controls this is 'explore.enabled'.
Can you check what its value is in your cdap-site.xml? Note that it is different from the 'explorer.enabled' that you mentioned.
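
For reference, here is a minimal sketch of that entry in cdap-site.xml (standard Hadoop-style XML configuration; the value shown assumes you want explore turned on):

<property>
  <name>explore.enabled</name>
  <value>true</value>
</property>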

Also, when creating a dataset that would create a Hive table, are there any error/warn logs in the CDAP logs at the same time?

Regards,
Ali Anwar


Omar Meza

Aug 6, 2018, 5:00:30 PM
to CDAP User
Thank you for the quick response!

Yes, I meant explore.enabled=true. Here are some of my test results:

cdap cli

execute "select * from RefDataset''

Error: co.cask.cdap.explore.service.ExploreException: Cannot execute query. Reason: Response code: 400, message: 'Bad Request', body: '[SQLState 42S02] Error while compiling statement: FAILED: SemanticException [Error 10001]: Line 1:14 Table not found 'RefDataset''

Enabling Explore:

POST /v3/namespaces/<namespace-id>/data/explore/datasets/<dataset-name>/enable
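
For example, invoked along these lines with curl (host and port are placeholders for my environment):

curl -X POST "http://<router-host>:<router-port>/v3/namespaces/default/data/explore/datasets/RefDataset/enable"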

SQL exception while trying to enable explore on dataset dataset:default.RefDataset

Pipeline Logs:

2018-08-06 15:19:14,987 - WARN [main:o.a.h.h.s.DomainSocketFactory@117] - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-08-06 15:19:15,521 - WARN [main:o.a.h.m.i.MetricsConfig@125] - Cannot locate configuration: tried hadoop-metrics2-mrappmaster.properties,hadoop-metrics2.properties
2018-08-06 15:19:15,911 - WARN [main:o.a.h.m.v.a.MRAppMaster@126] - 2018-08-06 15:19:15.908:INFO::Logging to STDERR via org.mortbay.log.StdErrLog
2018-08-06 15:19:15,933 - WARN [main:o.a.h.h.HttpRequestLog@100] - Jetty request log can only be enabled using Log4j
2018-08-06 15:19:16,081 - WARN [main:o.a.h.m.v.a.MRAppMaster@126] - 2018-08-06 15:19:16.081:INFO::jetty-6.1.26.hwx
2018-08-06 15:19:16,114 - WARN [main:o.a.h.m.v.a.MRAppMaster@126] - 2018-08-06 15:19:16.114:INFO::Extract jar:file:/hadoop/data09/hadoop/yarn/local/filecache/7857/mapreduce.tar.gz/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.7.3.2.5.3.0-37.jar!/webapps/mapreduce to /hadoop/data04/hadoop/yarn/local/usercache/yarn/appcache/application_1532173678258_30111/container_e41_1532173678258_30111_01_000001/tmp/Jetty_0_0_0_0_42379_mapreduce____o3napu/webapp
2018-08-06 15:19:18,079 - WARN [main:o.a.h.m.v.a.MRAppMaster@126] - 2018-08-06 15:19:18.078:INFO::Started HttpServer2$SelectChannelConne...@0.0.0.0:42379
2018-08-06 15:19:38,296 - WARN [main:o.a.h.m.i.MetricsConfig@125] - Cannot locate configuration: tried hadoop-metrics2-maptask.properties,hadoop-metrics2.properties
2018-08-06 15:19:43,012 - WARN [main:c.c.w.Wrangler@303] - Stage:Wrangler Preventive Proc - Context is set to default, no aliasing and restriction would be applied.
2018-08-06 15:19:49,071 - WARN [Thread-78:o.a.h.m.v.a.MRAppMaster@126] - 2018-08-06 15:19:49.070:INFO::Stopped HttpServer2$SelectChannelConne...@0.0.0.0:0

Thank you!
Omar

Ali Anwar

Aug 6, 2018, 5:05:57 PM
to cdap...@googlegroups.com
Hi Omar.

Can you attach the master logs for the time range when the pipeline was deployed?
When deploying, please use a dataset name that does not already exist.

Regards,
Ali Anwar


Andreas Neumann

Aug 6, 2018, 5:11:23 PM
to cdap...@googlegroups.com
Note that if your dataset name is RefDataset, the corresponding explore/Hive table is named "dataset_RefDataset". Can you try that?
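
For example, from the CDAP CLI (a sketch; note that Hive lowercases table names, so it may show up as dataset_refdataset):

execute 'select * from dataset_refdataset limit 5'
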
-Andreas


Omar Meza

Aug 6, 2018, 5:14:53 PM
to CDAP User
There is nothing in the master logs (master-cdap-*). I tried with "Data Preparation" but got the same results.

Seems like the cdap user is unable to create the Hive tables.

Select * From undefined.undefined limit 500

[SQLState 42S02] Error while compiling statement: FAILED: SemanticException [Error 10001]: Line 1:14 Table not found 'undefined'

Thank you!
Omar

Omar Meza

Aug 6, 2018, 5:42:43 PM
to CDAP User
Error found in the router-cdap-* log. Is the following related to the Hive issue?

2018-08-06 16:36:07,938 - ERROR [zk-client-EventThread:c.c.c.s.z.SharedResourceCache$1$1@122] - Failed to get data for child node 153904491

org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /keys/153904491

at org.apache.zookeeper.KeeperException.create(KeeperException.java:111) ~[zookeeper-3.4.6.2.5.3.0-37.jar:3.4.6-37--1]

at org.apache.zookeeper.KeeperException.create(KeeperException.java:51) ~[zookeeper-3.4.6.2.5.3.0-37.jar:3.4.6-37--1]

at org.apache.twill.internal.zookeeper.DefaultZKClientService$Callbacks$5.processResult(DefaultZKClientService.java:633) ~[org.apache.twill.twill-zookeeper-0.12.1.jar:0.12.1]

at org.apache.zookeeper.ClientCnxn$EventThread.processEvent(ClientCnxn.java:576) ~[zookeeper-3.4.6.2.5.3.0-37.jar:3.4.6-37--1]

at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:510) ~[zookeeper-3.4.6.2.5.3.0-37.jar:3.4.6-37--1]

2018-08-06 16:36:07,939 - INFO [zk-client-EventThread:c.c.c.s.z.SharedResourceCache$1@102] - Listing existing children for node /keys

Thank you!
Omar

Omar Meza

Aug 6, 2018, 5:49:48 PM
to CDAP User
Getting the following error in the explore service log.

How can I fix this? Authorization failed: No privilege 'Create' found.


2018-08-06 16:39:39,265 - INFO [explore.service-executor-15:h.q.p.ParseDriver@185] - Parsing command: CREATE EXTERNAL TABLE IF NOT EXISTS dataset_proccodetest (proccode string, modifier string, conditionclass string, diagnosissubset string, coreclass string) COMMENT 'CDAP Dataset' STORED BY 'co.cask.cdap.hive.datasets.DatasetStorageHandler' WITH SERDEPROPERTIES ('explore.dataset.name'='proccodetest', 'explore.dataset.namespace'='default') TBLPROPERTIES ('cdap.name'='proccodetest', 'cdap.version'='4.3.4-1522017174177')

2018-08-06 16:39:39,266 - INFO [explore.service-executor-15:h.q.p.ParseDriver@209] - Parse Completed

2018-08-06 16:39:39,295 - WARN [explore.service-executor-15:E.stderr@126] - Authorization failed:No privilege 'Create' found for outputs { database:default}. Use SHOW GRANT to get more details.

2018-08-06 16:39:39,296 - ERROR [explore.service-executor-15:o.a.h.h.q.Driver@989] - Authorization failed:No privilege 'Create' found for outputs { database:default}. Use SHOW GRANT to get more details.


Thank you!!
Omar

Ali Anwar

Aug 6, 2018, 7:07:47 PM
to cdap...@googlegroups.com
Hi Omar.

The cdap user doesn't seem to have create privileges on the Hive 'default' database. You can work with your Hadoop admin to grant the cdap user privileges to create tables in that database.
For more information, you can search for this error and also read https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Authorization.
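
For example, with Hive's legacy authorization mode (which matches the 'Create' privilege named in your error), the grant might look like this sketch; the exact syntax depends on which authorizer is in use (e.g. Ranger policies instead of GRANT statements):

GRANT CREATE ON DATABASE default TO USER cdap;
SHOW GRANT USER cdap ON DATABASE default;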

Regards,
Ali Anwar


Omar Meza

Aug 6, 2018, 9:20:20 PM
to CDAP User
I granted full permissions on both the dev and default databases. I've verified that I can create, select from, and drop tables as the cdap user. Restarted the CDAP service via Ambari.
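
(Roughly how I verified, as a sketch; the beeline connection details are placeholders and cdap_priv_test is just a throwaway table name:)

beeline -u 'jdbc:hive2://<hiveserver2-host>:10000' -n cdap
CREATE TABLE dev.cdap_priv_test (id INT);
DROP TABLE dev.cdap_priv_test;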

But I keep getting the same create-privilege issue in the explore service log.


Thank you for your help!
Omar

Omar Meza

Aug 7, 2018, 6:59:18 PM
to CDAP User
Seems like user "cdap" is unable to connect to the Hive metastore via Thrift.

Am I missing any impersonation setup for Hive?
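
For context, this is the kind of proxy-user setting I understand impersonation needs in core-site.xml (a sketch; the wildcard values are placeholders, assuming the CDAP services run as user cdap):

<property>
  <name>hadoop.proxyuser.cdap.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.cdap.groups</name>
  <value>*</value>
</property>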

2018-08-07 17:09:23,810 - WARN [explore.service-executor-18:o.a.h.h.c.HiveConf@2992] - HiveConf of name hive.heapsize does not exist

2018-08-07 17:09:23,811 - WARN [explore.service-executor-18:o.a.h.h.c.HiveConf@2992] - HiveConf of name hive.optimize.mapjoin.mapreduce does not exist

2018-08-07 17:09:23,811 - WARN [explore.service-executor-18:o.a.h.h.c.HiveConf@2992] - HiveConf of name hive.auto.convert.sortmerge.join.noconditionaltask does not exist

2018-08-07 17:09:23,811 - WARN [explore.service-executor-18:o.a.h.h.c.HiveConf@2992] - HiveConf of name hive.server2.enable.impersonation does not exist

2018-08-07 17:09:23,812 - WARN [explore.service-executor-18:o.a.h.h.c.HiveConf@2992] - HiveConf of name hive.semantic.analyzer.factory.impl does not exist

2018-08-07 17:09:23,812 - WARN [explore.service-executor-18:o.a.h.h.c.HiveConf@2992] - HiveConf of name hive.metastore.token.signature does not exist

2018-08-07 17:09:23,862 - INFO [explore.service-executor-18:h.q.p.ParseDriver@185] - Parsing command: CREATE EXTERNAL TABLE IF NOT EXISTS dataset_test7 (proccode string, modifier string, conditionclass string, diagnosissubset string, coreclass string) COMMENT 'CDAP Dataset' STORED BY 'co.cask.cdap.hive.datasets.DatasetStorageHandler' WITH SERDEPROPERTIES ('explore.dataset.name'='test7', 'explore.dataset.namespace'='devnucleo') TBLPROPERTIES ('cdap.name'='test7', 'cdap.version'='4.3.4-1522017174177')

2018-08-07 17:09:23,863 - INFO [explore.service-executor-18:h.q.p.ParseDriver@209] - Parse Completed

2018-08-07 17:09:23,875 - WARN [explore.service-executor-18:o.a.h.h.m.RetryingMetaStoreClient@187] - MetaStoreClient lost connection. Attempting to reconnect.

org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe

at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161) ~[org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73) ~[org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62) ~[org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_table(ThriftHiveMetastore.java:1224) ~[1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1215) ~[1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1234) ~[1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.getTable(SessionHiveMetaStoreClient.java:131) ~[1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source) ~[na:na]

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_60]

at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_60]

at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:159) ~[hive-metastore-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at com.sun.proxy.$Proxy50.getTable(Unknown Source) [na:na]

at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1158) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getTable(BaseSemanticAnalyzer.java:1419) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getTable(BaseSemanticAnalyzer.java:1404) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:11237) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10316) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10401) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:216) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:230) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:464) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:320) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1219) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1213) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:146) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:226) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.operation.Operation.run(Operation.java:276) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:468) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:450) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:286) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at co.cask.cdap.explore.service.hive.Hive14ExploreService.executeAsync(Hive14ExploreService.java:156) [na:na]

at co.cask.cdap.explore.service.hive.BaseHiveExploreService.execute(BaseHiveExploreService.java:915) [na:na]

at co.cask.cdap.explore.service.hive.BaseHiveExploreService.execute(BaseHiveExploreService.java:894) [na:na]

at co.cask.cdap.explore.service.ExploreTableManager.enableDataset(ExploreTableManager.java:190) [na:na]

at co.cask.cdap.explore.executor.ExploreExecutorHttpHandler$3.call(ExploreExecutorHttpHandler.java:218) [na:na]

at co.cask.cdap.explore.executor.ExploreExecutorHttpHandler$3.call(ExploreExecutorHttpHandler.java:215) [na:na]

at co.cask.cdap.security.impersonation.ImpersonationUtils$1.run(ImpersonationUtils.java:47) [na:na]

at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_60]

at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_60]

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724) [hadoop-common-2.7.3.2.5.3.0-37.jar:na]

at co.cask.cdap.security.impersonation.ImpersonationUtils.doAs(ImpersonationUtils.java:44) [na:na]

at co.cask.cdap.security.impersonation.DefaultImpersonator.doAs(DefaultImpersonator.java:74) [na:na]

at co.cask.cdap.security.impersonation.DefaultImpersonator.doAs(DefaultImpersonator.java:63) [na:na]

at co.cask.cdap.explore.executor.ExploreExecutorHttpHandler.enableDataset(ExploreExecutorHttpHandler.java:215) [na:na]

at co.cask.cdap.explore.executor.ExploreExecutorHttpHandler.enableInternal(ExploreExecutorHttpHandler.java:178) [na:na]

at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source) ~[na:na]

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_60]

at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_60]

at co.cask.http.HttpMethodInfo.invoke(HttpMethodInfo.java:80) [co.cask.http.netty-http-0.16.0.jar:na]

at co.cask.http.HttpDispatcher.messageReceived(HttpDispatcher.java:38) [co.cask.http.netty-http-0.16.0.jar:na]

at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.handler.execution.ChannelUpstreamEventRunnable.doRun(ChannelUpstreamEventRunnable.java:43) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.handler.execution.ChannelEventRunnable.run(ChannelEventRunnable.java:67) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.handler.execution.OrderedMemoryAwareThreadPoolExecutor$ChildExecutor.run(OrderedMemoryAwareThreadPoolExecutor.java:314) [io.netty.netty-3.6.6.Final.jar:na]

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_60]

at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_60]

at java.lang.Thread.run(Thread.java:745) [na:1.8.0_60]

Caused by: java.net.SocketException: Broken pipe

at java.net.SocketOutputStream.socketWrite0(Native Method) ~[na:1.8.0_60]

at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:109) ~[na:1.8.0_60]

at java.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:1.8.0_60]

at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.8.0_60]

at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.8.0_60]

at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159) ~[org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

... 58 common frames omitted

2018-08-07 17:09:28,884 - WARN [explore.service-executor-18:o.a.t.t.TIOStreamTransport@112] - Error closing output stream.

java.net.SocketException: Socket closed

at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:116) ~[na:1.8.0_60]

at java.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:1.8.0_60]

at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.8.0_60]

at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.8.0_60]

at java.io.FilterOutputStream.close(FilterOutputStream.java:158) ~[na:1.8.0_60]

at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110) ~[org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

at org.apache.thrift.transport.TSocket.close(TSocket.java:235) [org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:526) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:324) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:151) [hive-metastore-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at com.sun.proxy.$Proxy50.getTable(Unknown Source) [na:na]

at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1158) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getTable(BaseSemanticAnalyzer.java:1419) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getTable(BaseSemanticAnalyzer.java:1404) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:11237) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10316) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10401) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:216) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:230) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:464) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:320) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1219) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1213) [1533306089975-hive-exec-1.2.1000.2.5.3.0-37.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:146) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:226) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.operation.Operation.run(Operation.java:276) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:468) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:450) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:286) [1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at co.cask.cdap.explore.service.hive.Hive14ExploreService.executeAsync(Hive14ExploreService.java:156) [na:na]

at co.cask.cdap.explore.service.hive.BaseHiveExploreService.execute(BaseHiveExploreService.java:915) [na:na]

at co.cask.cdap.explore.service.hive.BaseHiveExploreService.execute(BaseHiveExploreService.java:894) [na:na]

at co.cask.cdap.explore.service.ExploreTableManager.enableDataset(ExploreTableManager.java:190) [na:na]

at co.cask.cdap.explore.executor.ExploreExecutorHttpHandler$3.call(ExploreExecutorHttpHandler.java:218) [na:na]

at co.cask.cdap.explore.executor.ExploreExecutorHttpHandler$3.call(ExploreExecutorHttpHandler.java:215) [na:na]

at co.cask.cdap.security.impersonation.ImpersonationUtils$1.run(ImpersonationUtils.java:47) [na:na]

at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_60]

at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_60]

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724) [hadoop-common-2.7.3.2.5.3.0-37.jar:na]

at co.cask.cdap.security.impersonation.ImpersonationUtils.doAs(ImpersonationUtils.java:44) [na:na]

at co.cask.cdap.security.impersonation.DefaultImpersonator.doAs(DefaultImpersonator.java:74) [na:na]

at co.cask.cdap.security.impersonation.DefaultImpersonator.doAs(DefaultImpersonator.java:63) [na:na]

at co.cask.cdap.explore.executor.ExploreExecutorHttpHandler.enableDataset(ExploreExecutorHttpHandler.java:215) [na:na]

at co.cask.cdap.explore.executor.ExploreExecutorHttpHandler.enableInternal(ExploreExecutorHttpHandler.java:178) [na:na]

at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source) ~[na:na]

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_60]

at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_60]

at co.cask.http.HttpMethodInfo.invoke(HttpMethodInfo.java:80) [co.cask.http.netty-http-0.16.0.jar:na]

at co.cask.http.HttpDispatcher.messageReceived(HttpDispatcher.java:38) [co.cask.http.netty-http-0.16.0.jar:na]

at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.handler.execution.ChannelUpstreamEventRunnable.doRun(ChannelUpstreamEventRunnable.java:43) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.handler.execution.ChannelEventRunnable.run(ChannelEventRunnable.java:67) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.handler.execution.OrderedMemoryAwareThreadPoolExecutor$ChildExecutor.run(OrderedMemoryAwareThreadPoolExecutor.java:314) [io.netty.netty-3.6.6.Final.jar:na]

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_60]

at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_60]

at java.lang.Thread.run(Thread.java:745) [na:1.8.0_60]

2018-08-07 17:09:28,886 - INFO [explore.service-executor-18:h.metastore@402] - Trying to connect to metastore with URI thrift://xxxxxxx.xxxxx.xxxxxx.xxxxx:9083

2018-08-07 17:09:28,887 - INFO [explore.service-executor-18:h.metastore@498] - Connected to metastore.

2018-08-07 17:09:28,966 - WARN [explore.service-executor-18:E.stderr@126] - Authorization failed:No privilege 'Create' found for outputs { database:dev}. Use SHOW GRANT to get more details.

2018-08-07 17:09:28,967 - ERROR [explore.service-executor-18:o.a.h.h.q.Driver@989] - Authorization failed:No privilege 'Create' found for outputs { database:dev}. Use SHOW GRANT to get more details.

2018-08-07 17:10:15,229 - ERROR [explore.service-executor-15:h.log@1239] - Got exception: org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe

org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe

at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161) ~[org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73) ~[org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62) ~[org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_all_tables(ThriftHiveMetastore.java:1197) ~[1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_tables(ThriftHiveMetastore.java:1189) ~[1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllTables(HiveMetaStoreClient.java:1288) ~[1533306095859-hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar:1.2.1000.2.5.3.0-37]

at co.cask.cdap.explore.service.hive.BaseHiveExploreService.getTables(BaseHiveExploreService.java:597) [na:na]

at co.cask.cdap.explore.executor.NamespacedExploreMetadataHttpHandler$1.call(NamespacedExploreMetadataHttpHandler.java:73) [na:na]

at co.cask.cdap.explore.executor.NamespacedExploreMetadataHttpHandler$1.call(NamespacedExploreMetadataHttpHandler.java:70) [na:na]

at co.cask.cdap.security.impersonation.ImpersonationUtils$1.run(ImpersonationUtils.java:47) [na:na]

at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_60]

at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_60]

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724) [hadoop-common-2.7.3.2.5.3.0-37.jar:na]

at co.cask.cdap.security.impersonation.ImpersonationUtils.doAs(ImpersonationUtils.java:44) [na:na]

at co.cask.cdap.security.impersonation.DefaultImpersonator.doAs(DefaultImpersonator.java:74) [na:na]

at co.cask.cdap.security.impersonation.DefaultImpersonator.doAs(DefaultImpersonator.java:63) [na:na]

at co.cask.cdap.explore.executor.NamespacedExploreMetadataHttpHandler.getTables(NamespacedExploreMetadataHttpHandler.java:70) [na:na]

at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source) ~[na:na]

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_60]

at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_60]

at co.cask.http.HttpMethodInfo.invoke(HttpMethodInfo.java:80) [co.cask.http.netty-http-0.16.0.jar:na]

at co.cask.http.HttpDispatcher.messageReceived(HttpDispatcher.java:38) [co.cask.http.netty-http-0.16.0.jar:na]

at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.handler.execution.ChannelUpstreamEventRunnable.doRun(ChannelUpstreamEventRunnable.java:43) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.handler.execution.ChannelEventRunnable.run(ChannelEventRunnable.java:67) [io.netty.netty-3.6.6.Final.jar:na]

at org.jboss.netty.handler.execution.OrderedMemoryAwareThreadPoolExecutor$ChildExecutor.run(OrderedMemoryAwareThreadPoolExecutor.java:314) [io.netty.netty-3.6.6.Final.jar:na]

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_60]

at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_60]

at java.lang.Thread.run(Thread.java:745) [na:1.8.0_60]

Caused by: java.net.SocketException: Broken pipe

at java.net.SocketOutputStream.socketWrite0(Native Method) ~[na:1.8.0_60]

at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:109) ~[na:1.8.0_60]

at java.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:1.8.0_60]

at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.8.0_60]

at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.8.0_60]

at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159) ~[org.apache.thrift.libthrift-0.9.3.jar:0.9.3]

... 30 common frames omitted

2018-08-07 17:10:15,229 - ERROR [explore.service-executor-15:h.log@1240] - Converting exception to MetaException
