Hi,

Not sure if this question has been asked before.

I'm running HDP 2.0 with Hive 0.12.0.2.0, and based on recommendations I've read I'm installing Hue 3.0. I also have a Hive/Accumulo integration in place that lets me create Hive tables backed by Accumulo tables. From the Hive shell, launched as

    hive -hiveconf accumulo.instance.name=accumulo \
         -hiveconf accumulo.zookeepers=localhost \
         -hiveconf accumulo.user.name=hive \
         -hiveconf accumulo.user.pass=hive

I can query the tables I have created. However, running the same queries from Hue (via the Hive UI) fails:
Number of reduce tasks is set to 0 since there's no reduce operator
java.lang.IllegalArgumentException: argument was null:Is null- arg1? true arg2? true
at org.apache.accumulo.core.util.ArgumentChecker.notNull(ArgumentChecker.java:36)
at org.apache.accumulo.core.client.ZooKeeperInstance.<init>(ZooKeeperInstance.java:100)
at org.apache.accumulo.core.client.ZooKeeperInstance.<init>(ZooKeeperInstance.java:86)
at com.xxx.xxxx.analytics.tools.storagehandler.HiveAccumuloTableInputFormat.getInstance(HiveAccumuloTableInputFormat.java:152)
at com.xxx.xxx.analytics.tools.storagehandler.HiveAccumuloTableInputFormat.getSplits(HiveAccumuloTableInputFormat.java:103)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:294)
at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:303)
at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:518)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1528)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1528)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1437)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1215)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1043)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:102)
at org.apache.hive.service.cli.operation.SQLOperation.access$000(SQLOperation.java:62)
at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:153)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Job Submission failed with exception 'java.lang.IllegalArgumentException(argument was null:Is null- arg1? true arg2? true)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
I'm quite certain this is due to the Hive/Accumulo integration I have implemented: the "arg1? true arg2? true" message suggests the ZooKeeperInstance constructor is receiving null for both the instance name and the ZooKeeper hosts, presumably because the accumulo.* values I pass with -hiveconf on the CLI never reach the server process that Hue queries through. From Hue, I can't find anywhere to define the Accumulo connection details. Is there any workaround for this?

Many thanks.
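One workaround I'm considering (untested; it assumes Hue is going through HiveServer2, which the SQLOperation frames in the trace suggest, and that the storage handler reads these properties from the session configuration): issue the same settings as SET statements at the top of the query in Hue's editor, since SET key=value in HiveQL sets a per-session configuration value just like -hiveconf does on the CLI. The table name below is a placeholder.

    -- Same values I pass with -hiveconf on the CLI, applied to this session instead
    SET accumulo.instance.name=accumulo;
    SET accumulo.zookeepers=localhost;
    SET accumulo.user.name=hive;
    SET accumulo.user.pass=hive;

    -- my_accumulo_table stands in for one of the Accumulo-backed tables
    SELECT * FROM my_accumulo_table LIMIT 10;

If that works, a more permanent option might be to add the same four properties to hive-site.xml on the host running the Hive server and restart it, so every session picks them up.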