Query from hive UI returned java.lang.IllegalArgumentException: argument was null:Is null- arg1? true arg2? true error

Norliana Maleh

Jun 15, 2017, 5:43:57 AM
to Hue-Users
Hi,

Not sure if this question has been asked before.

I'm using HDP 2.0 and Hive 0.12.0.2.0. Based on recommendations I've read, I installed Hue 3.0. I also have a Hive and Accumulo integration where I can create Hive tables containing data from Accumulo tables. Starting the Hive shell with

hive -hiveconf accumulo.instance.name=accumulo -hiveconf accumulo.zookeepers=localhost -hiveconf accumulo.user.name=hive -hiveconf accumulo.user.pass=hive

I can query the tables I have created. However, from Hue (via the Hive UI) I get the following:

Number of reduce tasks is set to 0 since there's no reduce operator
java.lang.IllegalArgumentException: argument was null:Is null- arg1? true arg2? true
        at org.apache.accumulo.core.util.ArgumentChecker.notNull(ArgumentChecker.java:36)
        at org.apache.accumulo.core.client.ZooKeeperInstance.<init>(ZooKeeperInstance.java:100)
        at org.apache.accumulo.core.client.ZooKeeperInstance.<init>(ZooKeeperInstance.java:86)
        at com.xxx.xxxx.analytics.tools.storagehandler.HiveAccumuloTableInputFormat.getInstance(HiveAccumuloTableInputFormat.java:152)
        at com.xxx.xxx.analytics.tools.storagehandler.HiveAccumuloTableInputFormat.getSplits(HiveAccumuloTableInputFormat.java:103)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:294)
        at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:303)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:518)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1528)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1528)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
        at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
        at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1437)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1215)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1043)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
        at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:102)
        at org.apache.hive.service.cli.operation.SQLOperation.access$000(SQLOperation.java:62)
        at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:153)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Job Submission failed with exception 'java.lang.IllegalArgumentException(argument was null:Is null- arg1? true arg2? true)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

I'm quite certain this is due to the Hive and Accumulo integration I have implemented; however, from Hue I can't find where to define the Accumulo details. Is there any workaround for this?
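
The only workaround I can think of so far would be to set the same properties at the top of the query in the Hue editor, along the lines of the sketch below (I haven't verified that this works from Hue; the table name is just an example):

SET accumulo.instance.name=accumulo;
SET accumulo.zookeepers=localhost;
SET accumulo.user.name=hive;
SET accumulo.user.pass=hive;
-- my_accumulo_table is just a placeholder for one of the Accumulo-backed tables
SELECT * FROM my_accumulo_table LIMIT 10;

but I don't know whether such SET statements reach the job configuration when the query is submitted through Hue.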

Many thanks.

Romain Rigaux

Jun 15, 2017, 10:12:15 AM
to Norliana Maleh, Hue-Users
Do you use 'hive' or 'beeline' as the Hive shell?

Norliana Maleh

Jun 16, 2017, 3:40:30 AM
to Hue-Users, norlian...@gmail.com
Hi,

I'm using 'hive' as the Hive shell.

Thanks,

Norliana Maleh

Jun 18, 2017, 9:40:28 PM
to Hue-Users, norlian...@gmail.com
Hi,

Is there any sort of custom or extended configuration where I can define my Accumulo details? These are the settings I used in the hive shell:

accumulo.zookeepers=localhost 
accumulo.user.pass=hive
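
Presumably the other two properties from my original hive invocation would be needed as well, so the full set would be:

accumulo.instance.name=accumulo
accumulo.zookeepers=localhost
accumulo.user.name=hive
accumulo.user.pass=hive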

Thanks.

Romain Rigaux

Jun 19, 2017, 2:15:01 AM
to Norliana Maleh, Hue-Users
Could you try with 'beeline'? It probably won't work either, and that would point to a Hive misconfiguration rather than a Hue issue.
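
Something along these lines, passing the same properties you give the hive shell (the JDBC URL is just a placeholder; point it at your HiveServer2 host and port, and you may need to issue the properties as SET statements once connected instead):

# jdbc:hive2://localhost:10000 is a placeholder for your HiveServer2 address
beeline -u jdbc:hive2://localhost:10000 \
    --hiveconf accumulo.instance.name=accumulo \
    --hiveconf accumulo.zookeepers=localhost \
    --hiveconf accumulo.user.name=hive \
    --hiveconf accumulo.user.pass=hive

If that fails with the same stack trace, the problem is in the Hive/HiveServer2 configuration rather than in Hue.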

