Hi,
We are using Alluxio over CDH 5.8.
I am trying to access Hive over Alluxio, starting the Hive shell with the command below:
1)
hive --hiveconf fs.alluxio.impl=alluxio.hadoop.FileSystem \
  --hiveconf fs.alluxio-ft.impl=alluxio.hadoop.FaultTolerantFileSystem \
  --hiveconf fs.AbstractFileSystem.alluxio.impl=alluxio.hadoop.AlluxioFileSystem \
  --hiveconf fs.default.name=alluxio://10.10.5.158:19998
The error given is:
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:540)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:625)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:574)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:518)
... 8 more
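The error says the scratch dir /tmp/hive is rwxr-xr-x, i.e. not writable by the Hive user. One possible workaround (a sketch based only on the error message, not something we have tried yet) would be to widen the permissions on /tmp/hive on whichever filesystem fs.default.name points at:

```shell
# Sketch of a possible fix: the Hive root scratch dir needs write
# permission for the user running Hive.
# If fs.default.name points at Alluxio, change it via the Alluxio CLI:
alluxio fs chmod 777 /tmp/hive
# If /tmp/hive is still on HDFS, use the Hadoop CLI instead:
# hadoop fs -chmod 777 /tmp/hive
```

The exact mode needed may be narrower (Hive only requires the directory to be writable for the session user), so 777 here is just the permissive test case.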
2)
If I remove "--hiveconf fs.default.name=alluxio://10.10.5.158:19998" from the above command and execute it, the Hive shell starts correctly, but I am NOT able to run a simple Hive count query:
hive> select count(1) from prac_tab;
Query ID = root_20160902122828_7b97d96d-978a-45fd-a396-aa54a61ffe16
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
The process gets stuck at this point and makes no further progress.
Can anyone help us out?
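When the job hangs right after the reducer messages, one way to dig further (a debugging sketch, not something from this thread; <app_id> is a placeholder you would copy from the list output) is to check what the submitted MapReduce job is actually doing in YARN:

```shell
# Debugging sketch: see whether the job was accepted by YARN and
# inspect its container logs for the real failure or wait condition.
yarn application -list                 # find the application id of the stuck job
yarn logs -applicationId <app_id>      # dump its task/container logs
```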
The conf files are attached here.
Hive conf:
export HIVE_CONF_DIR=/alluxio_conf/
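For reference, instead of passing the Alluxio client settings as --hiveconf flags on every invocation, they can be placed once in the Hadoop/Hive configuration (a sketch of a core-site.xml fragment; the property names and values mirror the flags used in the command above, and the file location depends on the CDH config layout):

```xml
<!-- Sketch: register the Alluxio filesystem client for Hive/Hadoop.
     These mirror the --hiveconf flags used earlier in this post. -->
<property>
  <name>fs.alluxio.impl</name>
  <value>alluxio.hadoop.FileSystem</value>
</property>
<property>
  <name>fs.alluxio-ft.impl</name>
  <value>alluxio.hadoop.FaultTolerantFileSystem</value>
</property>
<property>
  <name>fs.AbstractFileSystem.alluxio.impl</name>
  <value>alluxio.hadoop.AlluxioFileSystem</value>
</property>
```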
Thanks,
VikramT