java.io.IOException: Permission denied


TD

Apr 6, 2011, 7:02:11 PM
to Hue-Users
I just set up Hue and tried the PI example program. As you can see below,
it put the jar file in /tmp instead of the hadoop.tmp.dir directory,
which is set to /scratch/hadoop/hadoop-0.20/cache. It failed as
shown. I then tried to run the command from the command line, and it too
would not launch, due to a Permission Denied error. So I set the
permissions on the /tmp/jobsub-95jokR directory to 777 (it was set to
700). Then it worked. This doesn't seem to be the right behavior. So,
my questions are:

1. What tmp dir should Hue use?
2. Is there a setting somewhere in the config files I should change?
If so, what is it?
3. Should the user and hue be part of the same group (although with a
perm setting of 700 this won't help either)?

Thanks,

TD.

2011-04-06 17:22:24,371 INFO server Copying /user/hue/jobsub/examples/hadoop-examples.jar->/tmp/jobsub-95jokR/work/tmp.jar
2011-04-06 17:22:24,557 INFO server all_clusters: [<hadoop.job_tracker.LiveJobTracker object at 0xd3d3690>, <hadoop.fs.hadoopfs.HadoopFileSystem object at 0xd1b3f10>]
2011-04-06 17:22:24,557 INFO server Starting ['/usr/lib/hadoop-0.20/bin/hadoop', 'jar', 'tmp.jar', 'pi', '1000', '2']. (Env: {'HADOOP_CLASSPATH': '/usr/share/hue/apps/jobsub/src/jobsub/../../java-lib/trace.jar:/usr/share/hue/desktop/libs/hadoop/src/hadoop/../../static-group-mapping/java-lib/static-group-mapping-1.2.0.jar', 'HUE_JOBTRACE_LOG': '/tmp/jobsub-95jokR/jobs', 'HUE_JOBSUB_USER': 'delsorbo', 'HADOOP_OPTS': '-javaagent:/usr/share/hue/ext/thirdparty/java/aspectj-1.6.5/aspectjweaver.jar -Dhue.suffix=-via-hue -Duser.name=delsorbo', 'HUE_JOBSUB_GROUPS': 'delsorbo', 'HADOOP_HOME': '/usr/lib/hadoop-0.20'})
2011-04-06 17:22:24,557 INFO server Running: /usr/lib/hadoop-0.20/bin/hadoop jar tmp.jar pi 1000 2
Exception in thread "main" java.io.IOException: Permission denied
        at java.io.UnixFileSystem.createFileExclusively(Native Method)
        at java.io.File.checkAndCreate(File.java:1704)
        at java.io.File.createTempFile(File.java:1792)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:146)
2011-04-06 17:22:26,356 ERROR server jobsubd PlanRunner saw exception.
Traceback (most recent call last):
  File "/usr/share/hue/apps/jobsub/src/jobsub/server.py", line 159, in run
    self.run_bin_hadoop_step(step.bin_hadoop_step)
  File "/usr/share/hue/apps/jobsub/src/jobsub/server.py", line 278, in run_bin_hadoop_step
    raise Exception("bin/hadoop returned non-zero %d" % retcode)
Exception: bin/hadoop returned non-zero 1
2011-04-06 17:22:26,411 INFO server Marked jobsubd job 6 as done.
2011-04-06 17:22:26,546 INFO server Sent notification email about job 6.

bc Wong

Apr 6, 2011, 9:53:13 PM
to TD, Hue-Users
On Wed, Apr 6, 2011 at 4:02 PM, TD <anthony....@gmail.com> wrote:
> I just set up Hue and tried the PI example program. As you can see below,
> it put the jar file in /tmp instead of the hadoop.tmp.dir directory,
> which is set to /scratch/hadoop/hadoop-0.20/cache. It failed as
> shown. I then tried to run the command from the command line, and it too
> would not launch, due to a Permission Denied error. So I set the
> permissions on the /tmp/jobsub-95jokR directory to 777 (it was set to
> 700). Then it worked. This doesn't seem to be the right behavior. So,
> my questions are:
>
> 1. What tmp dir should Hue use?
> 2. Is there a setting somewhere in the config files I should change?
> If so, what is it?
> 3. Should the user and hue be part of the same group (although with a
> perm setting of 700 this won't help either)?

Hi TD,

Here's an explanation alongside the log statements.

> 2011-04-06 17:22:24,371 INFO   server Copying /user/hue/jobsub/examples/hadoop-examples.jar->/tmp/jobsub-95jokR/work/tmp.jar

To run a job, we need a jar. The jar in this example is stored in
HDFS. Here Hue is simply copying the jar from the cluster to its
local machine for submission. This isn't the error.
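
For reference, that step is roughly equivalent to running something like
the following by hand on the Hue node (just a sketch; the jobsub-XXXX work
directory name is generated per submission):

    # pull the example jar out of HDFS into the local per-job work directory
    hadoop fs -get /user/hue/jobsub/examples/hadoop-examples.jar /tmp/jobsub-XXXX/work/tmp.jar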

> 2011-04-06 17:22:24,557 INFO   server all_clusters: [<hadoop.job_tracker.LiveJobTracker object at 0xd3d3690>, <hadoop.fs.hadoopfs.HadoopFileSystem object at 0xd1b3f10>]
> 2011-04-06 17:22:24,557 INFO   server Starting ['/usr/lib/hadoop-0.20/bin/hadoop', 'jar', 'tmp.jar', 'pi', '1000', '2'].  (Env: {'HADOOP_CLASSPATH': '/usr/share/hue/apps/jobsub/src/jobsub/../../java-lib/trace.jar:/usr/share/hue/desktop/libs/hadoop/src/hadoop/../../static-group-mapping/java-lib/static-group-mapping-1.2.0.jar', 'HUE_JOBTRACE_LOG': '/tmp/jobsub-95jokR/jobs', 'HUE_JOBSUB_USER': 'delsorbo', 'HADOOP_OPTS': '-javaagent:/usr/share/hue/ext/thirdparty/java/aspectj-1.6.5/aspectjweaver.jar -Dhue.suffix=-via-hue -Duser.name=delsorbo', 'HUE_JOBSUB_GROUPS': 'delsorbo', 'HADOOP_HOME': '/usr/lib/hadoop-0.20'})
> 2011-04-06 17:22:24,557 INFO   server Running: /usr/lib/hadoop-0.20/bin/hadoop jar tmp.jar pi 1000 2

Is this the command that you tried on the CLI that gave an error?
At this point Hue is calling Hadoop to submit the job.
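
To reproduce what Hue actually does, you can run the same submission as
the user that runs jobsubd (typically `hue'), from inside the per-job work
directory shown in the log above. Something like this sketch, which
assumes you have sudo access on the Hue node:

    # run the submission as the 'hue' user from the per-job work directory
    sudo -u hue sh -c 'cd /tmp/jobsub-95jokR/work && /usr/lib/hadoop-0.20/bin/hadoop jar tmp.jar pi 1000 2'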

> Exception in thread "main" java.io.IOException: Permission denied
>        at java.io.UnixFileSystem.createFileExclusively(Native Method)
>        at java.io.File.checkAndCreate(File.java:1704)
>        at java.io.File.createTempFile(File.java:1792)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:146)

This error comes from within Hadoop. Unfortunately it doesn't
tell us what the path is. Most likely your hadoop.tmp.dir
(which is a local directory on the Hue node) is not writable.
The job submission needs to unpack the jar in there. When you
run hadoop on the command line, hadoop.tmp.dir needs to be
writable by you. When you do it through Hue, it needs to be
writable by the `hue' user.
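
A quick way to confirm the diagnosis on the Hue node (a sketch, assuming
hadoop.tmp.dir is the /scratch path from your config):

    # who owns hadoop.tmp.dir and what are its permissions?
    ls -ld /scratch/hadoop/hadoop-0.20/cache
    # can the 'hue' user actually create a file there?
    sudo -u hue touch /scratch/hadoop/hadoop-0.20/cache/write-test && echo "hue can write"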

One easy fix is to set the permissions on your hadoop.tmp.dir
to 0777. Or see this discussion:
http://archive.cloudera.com/cdh/3/hue/manual.html#_further_hadoop_configuration_and_caveats
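
Concretely, the first option looks something like this (a sketch, using
the /scratch path from your config; note that Hadoop's stock default for
hadoop.tmp.dir is a per-user path, /tmp/hadoop-${user.name}, which
sidesteps the shared-directory problem):

    # make the local hadoop.tmp.dir world-writable so the 'hue' user can
    # create and unpack temp files in it
    sudo chmod 0777 /scratch/hadoop/hadoop-0.20/cache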

Cheers,
--
bc Wong
Cloudera Software Engineer
