ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:java.io.IOException: java.io.IOException: Broken


Srinivasan Ramalingam

Mar 20, 2014, 8:00:56 AM
to rh...@googlegroups.com
Hi Guys,
            I am running a RHIPE map-reduce program on Cloudera Manager (CDH4). When I run my RHIPE map-reduce job with a small dataset it works fine, but when I run the same map-reduce program with a 12 GB dataset I get the error below on one particular data node. How do I rectify this error? Can anybody help me solve this issue?
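
For context, a minimal sketch of the kind of RHIPE job submission involved is below; the original program is not shown in the thread, so the word-count logic, the HDFS paths, and the rhwatch()/rhfmt() calls (RHIPE 0.73-style API) are assumptions. The "Broken pipe" in the trace is the Java side failing to write records to the R child process after that process has exited.

# Minimal RHIPE job sketch (hypothetical; paths and options are placeholders).
library(Rhipe)
rhinit()

# Map: RHIPE hands each task a batch of records as map.keys / map.values.
map <- expression({
  lapply(map.values, function(line) {
    for (w in unlist(strsplit(line, "[[:space:]]+"))) {
      if (nchar(w) > 0) rhcollect(w, 1L)   # emit (word, 1)
    }
  })
})

# Reduce: pre/reduce/post blocks see reduce.key and reduce.values.
reduce <- expression(
  pre    = { total <- 0L },
  reduce = { total <- total + sum(unlist(reduce.values)) },
  post   = { rhcollect(reduce.key, total) }
)

# Submit; if the R process on a node dies (R missing, out of memory, ...),
# the Java writer sees a "Broken pipe" like the one in the trace below.
job <- rhwatch(map = map, reduce = reduce,
               input  = rhfmt("/user/root/input-12gb", type = "text"),
               output = "/user/root/wordcount-out")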

2014-03-20 17:19:33,143 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:java.io.IOException: java.io.IOException: Broken pipe
2014-03-20 17:19:33,143 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: java.io.IOException: Broken pipe
	at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:112)
	at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:58)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Broken pipe
	at java.io.FileOutputStream.writeBytes(Native Method)
	at java.io.FileOutputStream.write(FileOutputStream.java:282)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
	at java.io.DataOutputStream.write(DataOutputStream.java:90)
	at org.godhuli.rhipe.RHBytesWritable.write(RHBytesWritable.java:115)
	at org.godhuli.rhipe.RHMRHelper.write(RHMRHelper.java:290)
	at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:107)
	... 8 more
2014-03-20 17:19:33,148 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task

Ondrej Nekola

Mar 20, 2014, 11:31:00 AM
to rh...@googlegroups.com
Do you have enough space?
   O.N.
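
(If space is the suspect, a couple of quick checks from R on the struggling node might look like the sketch below; the paths are typical CDH defaults and only a guess.)

# Rough disk-space checks (sketch; adjust the paths for your install).
system("df -h /tmp /var/lib/hadoop-hdfs")     # local disks used by task scratch and HDFS
system("hdfs dfsadmin -report | head -n 20")  # cluster-wide HDFS capacity and usage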


Saptarshi Guha

Mar 20, 2014, 11:39:13 AM
to rh...@googlegroups.com
Ideally the errors should come back to you. RHIPE is not perfect, alas.

What you need to do is go to the jobtracker, then go to the failed tasks
and open the log link. Very likely you'll see an R error in the logs.
Most likely the job failed so quickly that there was not enough time to
report the errors back.
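
One way to surface the R-side failure without clicking through the jobtracker UI is to check the job status from R and to trap errors inside the map expression so they show up as Hadoop counters. A rough sketch, assuming rhstatus() and rhcounter() behave as in RHIPE 0.73; do_work() is a stand-in for your own map logic:

# Check the state and counters of a submitted job (sketch).
st <- rhstatus(job)        # 'job' is what rhwatch()/rhex() returned
print(st)

# Trap R errors in the map expression and report them as counters,
# so the R error message is visible even if the task dies quickly.
map <- expression({
  for (i in seq_along(map.values)) {
    tryCatch(
      rhcollect(map.keys[[i]], do_work(map.values[[i]])),   # do_work(): hypothetical
      error = function(e) rhcounter("R_ERRORS", conditionMessage(e), 1)
    )
  }
})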

Srinivasan Ramalingam

Mar 21, 2014, 12:40:05 AM
to rh...@googlegroups.com
Hi guys,
     I solved that error. The R home path was not available on that particular data node, which is why the problem occurred. I set the correct R home path and now it is working perfectly.
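
For anyone hitting the same thing: a "Broken pipe" from RHMRMapper generally means the R child process on that node never started or died immediately, for example because R is missing or installed at a different path. A small diagnostic job that emits each node's hostname and R_HOME can confirm the installs match; this is a sketch built on the same rhwatch()/rhcollect() calls assumed earlier, and any task that still fails with the broken pipe pinpoints the node with the bad R install.

# Diagnostic job: one record per map task reporting where R lives on that node (sketch).
map <- expression({
  host <- Sys.info()[["nodename"]]
  rhcollect(host, list(R_HOME  = Sys.getenv("R_HOME"),
                       version = R.version.string))
})
check <- rhwatch(map = map,
                 input  = rhfmt("/user/root/small-sample", type = "text"),
                 output = "/user/root/rhome-check",
                 mapred = list(mapred.reduce.tasks = 0))   # map-only job
rhread("/user/root/rhome-check")   # expect the same R_HOME from every data node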