Rhipe_0.75.0_cdh5 - java.io.IOException: Stream Closed


Luke Hallett

May 20, 2014, 4:04:36 PM
to rh...@googlegroups.com
I have downloaded the rearch branch and compiled it against CDH5.  Everything works fine in a single-node environment, but when I try to run my map/reduce job on a two-node cluster, I get the following:

> z = rhwatch(map=m,reduce=r,input=rhfmt("/datasets/50kRows.tsv",type="text"),output="/output/test2",read=FALSE)
Saving 1 parameter to /tmp/rhipe-temp-params-d6df9b466a9f4e1e12861e6b47ad979f (use rhclean to delete all temp files)
[Tue May 20 07:56:00 2014] Name:2014-05-20 07:56:00 Job: job_1400604042731_0004  State: PREP Duration: -21592.514
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        0       0       0        0      0               0               0
reduce   0        0       0       0        0      0               0               0
Waiting 5 seconds
[Tue May 20 07:56:06 2014] Name:2014-05-20 07:56:00 Job: job_1400604042731_0004  State: PREP Duration: -21587.375
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        0       0       0        0      0               0               0
reduce   0        0       0       0        0      0               0               0
Waiting 5 seconds
[Tue May 20 07:56:11 2014] Name:2014-05-20 07:56:00 Job: job_1400604042731_0004  State: PREP Duration: -21582.243
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        0       0       0        0      0               0               0
reduce   0        0       0       0        0      0               0               0
Waiting 5 seconds
[Tue May 20 07:56:16 2014] Name:2014-05-20 07:56:00 Job: job_1400604042731_0004  State: RUNNING Duration: -21576.928
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        1       0       1        0      0               0               0
reduce   0        8       8       0        0      0               0               0
Waiting 5 seconds
There were Hadoop specific errors (autokill will not kill job), showing at most 30:
Error: java.io.IOException: java.io.IOException: Stream closed
at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.io.IOException: Stream closed
at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
at java.io.OutputStream.write(OutputStream.java:116)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
at java.io.DataOutputStream.write(DataOutputStream.java:107)
at org.godhuli.rhipe.RHBytesWritable.write(RHBytesWritable.java:121)
at org.godhuli.rhipe.RHMRHelper.write(RHMRHelper.java:341)
at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:126)
... 8 more

I have been combing through logs from the ResourceManager, and haven't been able to find anything to help troubleshoot.  Any ideas?

Thanks!
Luke.
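For anyone digging through the same failure: under YARN, a job's aggregated task logs can usually be pulled from the command line once the attempts finish, assuming log aggregation is enabled (the application id is just the job id with the job_ prefix replaced by application_):

yarn logs -applicationId application_1400604042731_0004 | less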

Saptarshi Guha

May 20, 2014, 4:08:11 PM
to rh...@googlegroups.com
We will be upgrading to CDH5 today; I can check this out.
However, are you using the new YARN-based MR (MR2) or the old MR (MR1)?

Luke Hallett

May 20, 2014, 4:11:16 PM
to rh...@googlegroups.com, saptars...@gmail.com
Honestly, I'm not sure which one is being launched.  I have both set up on my cluster, and I am seeing the job listed under the YARN ResourceManager.  So I am assuming it is using YARN :)

I am pretty new to Hadoop, and am not sure how/where to configure which one to use.
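One quick way to tell which framework a client submission will use is to check mapreduce.framework.name in the client's mapred-site.xml; a value of yarn means MR2 on YARN, while classic or local means MR1. The path below is the usual CDH client config location, though it can differ per install:

grep -A 1 'mapreduce.framework.name' /etc/hadoop/conf/mapred-site.xml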

Saptarshi Guha

May 20, 2014, 4:13:15 PM
to Luke Hallett, rh...@googlegroups.com
I'm not sure if RHIPE works on MR2. I'll get back to you.

Luke Hallett

May 20, 2014, 5:26:10 PM
to rh...@googlegroups.com, Luke Hallett, saptars...@gmail.com
Just to give you a bit of an update...

I removed the YARN service from my CDH5 cluster, re-ran my map/reduce job from R (via rhwatch), and I am getting the same error.

Here is the failed map task log from my JobTracker UI:
Attempt: attempt_201405200838_0002_m_000000_0   Task: task_201405200838_0002_m_000000   Machine: es-cdh-node2   State: FAILED
java.io.IOException: java.io.IOException: Stream closed
	at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
	at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
	at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
	at java.io.OutputStream.write(OutputStream.java:116)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
	at java.io.Da
Attempt: attempt_201405200838_0002_m_000000_1   Task: task_201405200838_0002_m_000000   Machine: es-cdh-node1   State: FAILED
java.io.IOException: java.io.IOException: Stream closed
	at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
	at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
	at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
	at java.io.OutputStream.write(OutputStream.java:116)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
	at java.io.Da
Attempt: attempt_201405200838_0002_m_000000_2   Task: task_201405200838_0002_m_000000   Machine: es-cdh-node1   State: FAILED
java.io.IOException: java.io.IOException: Stream closed
	at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
	at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
	at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
	at java.io.OutputStream.write(OutputStream.java:116)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
	at java.io.Da
Attempt: attempt_201405200838_0002_m_000000_3   Task: task_201405200838_0002_m_000000   Machine: es-cdh-node1   State: FAILED
java.io.IOException: java.io.IOException: Stream closed
	at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
	at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
	at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
	at java.io.OutputStream.write(OutputStream.java:116)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
	at java.io.Da


Hopefully that helps narrow things down a bit.

Saptarshi Guha

May 20, 2014, 5:28:51 PM
to Luke Hallett, rh...@googlegroups.com
If you click on the "All" links on the right and scroll down, you will sometimes find an R error (yeah, not cool).
Do you find any?

Luke Hallett

May 20, 2014, 5:58:00 PM
to rh...@googlegroups.com, Luke Hallett, saptars...@gmail.com
I'm not seeing anything, but here is the output of the "All" link from one of the nodes (maybe I'm missing it; I've been staring at logs all day :) ):

Task Logs: 'attempt_201405200838_0002_m_000000_1'



stdout logs
Classloader sun.misc.Launcher$AppClassLoader@566d0085:
file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/jars/classes
file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/jars/job.jar
file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/
file:/var/run/cloudera-scm-agent/process/118-mapreduce-TASKTRACKER/
file:/usr/java/jdk1.7.0_45-cloudera/lib/tools.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/hadoop-core-2.3.0-mr1-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/activation-1.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/ant-contrib-1.0b3.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/asm-3.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/avro/avro-compiler-1.7.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/avro/avro-1.7.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-beanutils-1.7.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-beanutils-core-1.8.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-cli-1.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-codec-1.4.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-collections-3.2.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-compress-1.4.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-configuration-1.6.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-digester-1.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-el-1.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-httpclient-3.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-io-2.4.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-lang-2.6.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-logging-1.1.3.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-math3-3.1.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/commons-net-3.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/guava-11.0.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/hadoop-fairscheduler-2.3.0-mr1-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/hadoop-fairscheduler.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/hsqldb-1.8.0.10.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/httpclient-4.2.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/httpcore-4.2.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jackson-core-asl-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jackson-jaxrs-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jackson-mapper-asl-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jackson-xc-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jasper-compiler-5.5.23.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jasper-runtime-5.5.23.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/java-xmlbuilder-0.4.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jaxb-api-2.2.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jaxb-impl-2.2.3-1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jersey-core-1.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jersey-json-1.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jersey-server-1.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jets3t-0.9.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jettison-1.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jetty-6.1.26.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jetty-util-6.1.26.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jline-0.9.94.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jsch-0.1.42.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jsp-api-2.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jsr305-1.3.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/junit-4.8.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/kfs-0.2.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/log4j-1.2.17.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/mockito-all-1.8.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/paranamer-2.3.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/protobuf-java-2.5.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/servlet-api-2.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/slf4j-api-1.7.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/snappy-java-1.0.4.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/stax-api-1.0-2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/xmlenc-0.52.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/xz-1.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/zookeeper-3.4.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jsp-2.1/jsp-2.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-0.20-mapreduce/lib/jsp-2.1/jsp-api-2.1.jar
file:/usr/share/cmf/lib/plugins/event-publish-5.0.1-shaded.jar
file:/usr/share/cmf/lib/plugins/tt-instrumentation-5.0.1.jar
file:/usr/share/cmf/lib/plugins/cdh5/navigator-cdh5-plugin-5.0.1-shaded.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/commons-io-2.4.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/jsp-api-2.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/servlet-api-2.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/log4j-1.2.17.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/commons-codec-1.4.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/asm-3.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/jsr305-1.3.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/commons-el-1.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/jersey-core-1.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/commons-lang-2.6.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/jetty-6.1.26.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/commons-cli-1.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/jersey-server-1.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/xmlenc-0.52.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/lib/guava-11.0.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/hadoop-hdfs-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/hadoop-hdfs-2.3.0-cdh5.0.1-tests.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/hadoop-hdfs-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hue/desktop/libs/hadoop/java-lib/hue-plugins-3.5.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jets3t-0.9.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-io-2.4.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jersey-json-1.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/slf4j-api-1.7.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jsp-api-2.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-beanutils-1.7.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/servlet-api-2.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jaxb-api-2.2.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/httpclient-4.2.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/log4j-1.2.17.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jsch-0.1.42.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jackson-mapper-asl-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-codec-1.4.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/asm-3.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-logging-1.1.3.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jsr305-1.3.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-math3-3.1.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-el-1.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-collections-3.2.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/avro/avro-1.7.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/snappy-java-1.0.4.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jasper-runtime-5.5.23.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-digester-1.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jackson-core-asl-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-net-3.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/zookeeper/zookeeper-3.4.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jersey-core-1.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/httpcore-4.2.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/activation-1.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-lang-2.6.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jetty-util-6.1.26.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jetty-6.1.26.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-configuration-1.6.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jackson-xc-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-httpclient-3.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/netty-3.6.2.Final.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-cli-1.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jettison-1.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/stax-api-1.0-2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/junit-4.8.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jersey-server-1.9.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/protobuf-java-2.5.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/mockito-all-1.8.5.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/jasper-compiler-5.5.23.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/commons-compress-1.4.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/java-xmlbuilder-0.4.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/xmlenc-0.52.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/xz-1.0.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/paranamer-2.3.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/lib/guava-11.0.2.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-generator-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-hadoop-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-pig-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-pig-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-format-1.0.0-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-hadoop-bundle-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-generator-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-format-1.0.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-scrooge-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-encoding-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-column-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/hadoop-common-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-pig-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-scrooge-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-cascading-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-cascading-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-avro-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-common-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/hadoop-annotations-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/hadoop-common-2.3.0-cdh5.0.1-tests.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-hadoop-bundle-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/hadoop-auth-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-encoding-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-thrift-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-format-1.0.0-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-encoding-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-thrift-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-test-hadoop2-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/hadoop-auth-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-pig-bundle-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-common-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-avro-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/hadoop-nfs-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-generator-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-column-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-cascading-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-common-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-hadoop-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-scrooge-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-thrift-1.2.5-cdh5.0.1-javadoc.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/hadoop-annotations-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-avro-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-pig-bundle-1.2.5-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/hadoop-nfs-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-column-1.2.5-cdh5.0.1-sources.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/hadoop/hadoop-common-2.3.0-cdh5.0.1.jar
file:/opt/cloudera/parcels/CDH-5.0.1-1.cdh5.0.1.p0.47/lib/parquet/parquet-hadoop-1.2.5-cdh5.0.1-sources.jar
Classloader sun.misc.Launcher$ExtClassLoader@e3d4817:
file:/usr/java/jdk1.7.0_45-cloudera/jre/lib/ext/zipfs.jar
file:/usr/java/jdk1.7.0_45-cloudera/jre/lib/ext/localedata.jar
file:/usr/java/jdk1.7.0_45-cloudera/jre/lib/ext/sunjce_provider.jar
file:/usr/java/jdk1.7.0_45-cloudera/jre/lib/ext/dnsns.jar
file:/usr/java/jdk1.7.0_45-cloudera/jre/lib/ext/sunec.jar
file:/usr/java/jdk1.7.0_45-cloudera/jre/lib/ext/sunpkcs11.jar
arg count: 6
/usr/local/bin/R
CMD
/usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce
--slave
--silent
--vanilla
total bytes of args:94



stderr logs
java.io.IOException: Stream closed
	at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
	at java.io.OutputStream.write(OutputStream.java:116)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
	at java.io.DataOutputStream.write(DataOutputStream.java:107)
	at org.godhuli.rhipe.RHBytesWritable.write(RHBytesWritable.java:121)
	at org.godhuli.rhipe.RHMRHelper.write(RHMRHelper.java:341)
	at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:126)
	at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)



syslog logs
2014-05-20 09:20:58,192 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2014-05-20 09:20:59,001 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /mapred/local/taskTracker/distcache/708072866974901654_1861982439_439906596/es-cdh-node1.corp.adobe.com/tmp/rhipe-temp-params-419d37f9ea5531712b442f9285e9e5a9 <- /mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/rhipe-temp-params-419d37f9ea5531712b442f9285e9e5a9
2014-05-20 09:20:59,009 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /mapred/local/taskTracker/es/jobcache/job_201405200838_0002/jars/job.jar <- /mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/job.jar
2014-05-20 09:20:59,011 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /mapred/local/taskTracker/es/jobcache/job_201405200838_0002/jars/.job.jar.crc <- /mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/.job.jar.crc
2014-05-20 09:20:59,080 INFO org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2014-05-20 09:20:59,082 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=MAP, sessionId=
2014-05-20 09:20:59,737 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
2014-05-20 09:20:59,743 INFO org.apache.hadoop.mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@6418984
2014-05-20 09:21:00,185 INFO org.apache.hadoop.mapred.MapTask: Processing split: hdfs://es-cdh-node1.corp.adobe.com:8020/datasets/CNBC/50kRows.tsv:0+18831154
2014-05-20 09:21:00,196 INFO org.apache.hadoop.mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2014-05-20 09:21:00,200 INFO org.apache.hadoop.mapred.MapTask: io.sort.mb = 256
2014-05-20 09:21:00,270 INFO org.apache.hadoop.mapred.MapTask: data buffer = 204010960/255013696
2014-05-20 09:21:00,270 INFO org.apache.hadoop.mapred.MapTask: record buffer = 671088/838860
2014-05-20 09:21:00,381 INFO org.godhuli.rhipe.RHMRHelper: rhipe_setup_map::writing to file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/rhipe_setup_map
2014-05-20 09:21:00,382 INFO org.godhuli.rhipe.RHMRHelper: rhipe_cleanup_map::writing to file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/rhipe_cleanup_map
2014-05-20 09:21:00,382 INFO org.godhuli.rhipe.RHMRHelper: rhipe_cleanup_reduce::writing to file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/rhipe_cleanup_reduce
2014-05-20 09:21:00,383 INFO org.godhuli.rhipe.RHMRHelper: rhipe_setup_reduce::writing to file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/rhipe_setup_reduce
2014-05-20 09:21:00,385 INFO org.godhuli.rhipe.RHMRHelper: rhipe_reduce_prekey::writing to file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/rhipe_reduce_prekey
2014-05-20 09:21:00,387 INFO org.godhuli.rhipe.RHMRHelper: rhipe_map::writing to file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/rhipe_map
2014-05-20 09:21:00,389 INFO org.godhuli.rhipe.RHMRHelper: rhipe_reduce_postkey::writing to file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/rhipe_reduce_postkey
2014-05-20 09:21:00,395 INFO org.godhuli.rhipe.RHMRHelper: rhipe_reduce::writing to file:/mapred/local/taskTracker/es/jobcache/job_201405200838_0002/attempt_201405200838_0002_m_000000_1/work/rhipe_reduce
2014-05-20 09:21:00,412 INFO org.godhuli.rhipe.RHMRHelper: Mapper:Started external program:/usr/local/bin/R CMD /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce --slave --silent --vanilla
2014-05-20 09:21:00,413 INFO org.godhuli.rhipe.RHMRHelper: Mapper:Started Error Thread
2014-05-20 09:21:00,415 INFO org.godhuli.rhipe.RHMRHelper: Mapper:Started Output Thread
2014-05-20 09:21:00,450 INFO org.godhuli.rhipe.RHMRHelper: Mapper:MROutputThread done
2014-05-20 09:21:00,465 INFO org.godhuli.rhipe.RHMRHelper: Mapper:MRErrorThread done
2014-05-20 09:21:00,467 INFO org.godhuli.rhipe.RHMRMapper: QUIIIITING:127
2014-05-20 09:21:00,486 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2014-05-20 09:21:00,489 WARN org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:es (auth:SIMPLE) cause:java.io.IOException: java.io.IOException: Stream closed
2014-05-20 09:21:00,490 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: java.io.IOException: Stream closed
	at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
	at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
	at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
	at java.io.OutputStream.write(OutputStream.java:116)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
	at java.io.DataOutputStream.write(DataOutputStream.java:107)
	at org.godhuli.rhipe.RHBytesWritable.write(RHBytesWritable.java:121)
	at org.godhuli.rhipe.RHMRHelper.write(RHMRHelper.java:341)
	at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:126)
	... 8 more
2014-05-20 09:21:00,496 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
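
One detail worth flagging in the syslog above: the QUIIIITING:127 line appears to be RHIPE reporting the exit status of the external R process, and 127 is the shell's conventional "cannot find or execute the command" status. That suggests the runner never actually started, which can be sanity-checked on each node using the paths from the log:

command -v R    # is R on the PATH of the user the tasks run as?
ls -l /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce    # does the runner exist, and is it executable?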

Saptarshi Guha

May 20, 2014, 6:02:01 PM
to Luke Hallett, rh...@googlegroups.com
Try this simpler one

r = rhwatch(map=function(a,b) rhcollect(10,1), input=c(1000,1))

There you will have only one log file to look at (and see errors).


Luke Hallett

May 20, 2014, 6:17:11 PM
to rh...@googlegroups.com, Luke Hallett, saptars...@gmail.com
Same thing with the simplified job.

Here is the output in R:

> rhoptions(runner="/usr/local/bin/R CMD /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce --slave --silent --vanilla")
> r = rhwatch(map=function(a,b) rhcollect(10,1), input=c(1000,1))
Saving 2 parameters to /tmp/rhipe-temp-params-8b454afbfa7119654192665a62d1b5a4 (use rhclean to delete all temp files)
2014-05-20 10:09:54,668 WARN  [main][JobClient] Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
[Tue May 20 10:09:55 2014] Name:2014-05-20 10:09:54 Job: job_201405200838_0005  State: PREP Duration: 0.125
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        0       0       0        0      0               0               0
reduce   0        0       0       0        0      0               0               0
Waiting 5 seconds
[Tue May 20 10:10:00 2014] Name:2014-05-20 10:09:54 Job: job_201405200838_0005  State: RUNNING Duration: 5.16
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        1       0       1        0      0               0               0
reduce   0        1       1       0        0      0               0               0
Waiting 5 seconds
[Tue May 20 10:10:05 2014] Name:2014-05-20 10:09:54 Job: job_201405200838_0005  State: RUNNING Duration: 10.193
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        1       0       1        0      0               0               0
reduce   0        1       1       0        0      0               0               0
Waiting 5 seconds
There were Hadoop specific errors (autokill will not kill job), showing at most 30:
java.io.IOException: java.io.IOException: Stream closed
at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
at java.io.OutputStream.write(OutputStream.java:116)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
at java.io.Da
[Tue May 20 10:10:10 2014] Name:2014-05-20 10:09:54 Job: job_201405200838_0005  State: RUNNING Duration: 15.282
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        1       0       1        0      0               1               0
reduce   0        1       1       0        0      0               0               0
Waiting 5 seconds
There were Hadoop specific errors (autokill will not kill job), showing at most 30:
java.io.IOException: java.io.IOException: Stream closed
at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
at java.io.OutputStream.write(OutputStream.java:116)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
at java.io.Da
[Tue May 20 10:10:15 2014] Name:2014-05-20 10:09:54 Job: job_201405200838_0005  State: RUNNING Duration: 20.324
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        1       0       1        0      0               1               0
reduce   0        1       1       0        0      0               0               0
Waiting 5 seconds
There were Hadoop specific errors (autokill will not kill job), showing at most 30:
java.io.IOException: java.io.IOException: Stream closed
at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
at java.io.OutputStream.write(OutputStream.java:116)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
at java.io.Da
[Tue May 20 10:10:20 2014] Name:2014-05-20 10:09:54 Job: job_201405200838_0005  State: RUNNING Duration: 25.365
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        1       0       1        0      0               2               0
reduce   0        1       1       0        0      0               0               0
Waiting 5 seconds
There were Hadoop specific errors (autokill will not kill job), showing at most 30:
java.io.IOException: java.io.IOException: Stream closed
at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
at java.io.OutputStream.write(OutputStream.java:116)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
at java.io.Da
[Tue May 20 10:10:25 2014] Name:2014-05-20 10:09:54 Job: job_201405200838_0005  State: RUNNING Duration: 30.411
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        1       0       1        0      0               3               0
reduce   0        1       1       0        0      0               0               0
Waiting 5 seconds
There were Hadoop specific errors (autokill will not kill job), showing at most 30:
java.io.IOException: java.io.IOException: Stream closed
at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
at java.io.OutputStream.write(OutputStream.java:116)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
at java.io.Da
[Tue May 20 10:10:30 2014] Name:2014-05-20 10:09:54 Job: job_201405200838_0005  State: RUNNING Duration: 35.46
       pct numtasks pending running complete killed failed_attempts killed_attempts
map      0        1       0       1        0      0               3               0
reduce   0        1       1       0        0      0               0               0
Waiting 5 seconds
There were Hadoop specific errors (autokill will not kill job), showing at most 30:
java.io.IOException: java.io.IOException: Stream closed
at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)
at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.io.IOException: Stream closed
at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)
at java.io.OutputStream.write(OutputStream.java:116)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
at java.io.Da
Warning message:
In Rhipe:::rhwatch.runner(job = job, mon.sec = mon.sec, readback = readback,  :
  Job failure, deleting output: /tmp/rhipe-temp-673c4ce8219b78f9b5128a5ad356a1bf:

I'm not sure what you mean by "(and see errors)".

Thanks, by the way, for your quick responses and help with this!

Luke

Saptarshi Guha

May 20, 2014, 6:22:32 PM
to Luke Hallett, rh...@googlegroups.com
I mean: can you go to the web UI and view "All" for the entire log?
Sometimes RHIPE drops the errors there, and sometimes it doesn't.

btw, on the nodes, does RhipeMapReduce run?

On my comp, I do this:
 RHIPEWHAT=0 R CMD /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce --slave --silent --vanilla

(and then it waits)

Can you try your runner on the nodes?

Luke Hallett

May 20, 2014, 6:35:20 PM
to rh...@googlegroups.com, Luke Hallett, saptars...@gmail.com
When I run that command at the Linux command line (that is what you meant, right?), I get the following on both nodes:

[es@es-cdh-node1 ~]$ RHIPEWHAT=0 R CMD /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce --slave --silent --vanilla
Segmentation fault (core dumped)
[es@es-cdh-node1 ~]$

[es@es-cdh-node2 bin]$ RHIPEWHAT=0 R CMD /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce --slave --silent --vanilla
Segmentation fault (core dumped)
[es@es-cdh-node2 bin]$

(I did verify that the path to RhipeMapReduce was correct.)

Do you still want the complete error log from the JobTracker?

Saptarshi Guha

May 20, 2014, 6:37:30 PM
to Luke Hallett, rh...@googlegroups.com
Yes, it shouldn't segfault. I have a feeling the R process is not even starting on the nodes ...
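
A segfault at startup like this often means a shared library the binary was linked against is missing or mismatched on that node (RHIPE's runner is compiled against protocol buffers, for example). Assuming RhipeMapReduce is the compiled binary at the path used above, a first check is:

# any 'not found' entries name the missing dependency
ldd /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce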

Luke Hallett

May 20, 2014, 6:52:41 PM
to rh...@googlegroups.com, Luke Hallett, saptars...@gmail.com
ok, I'll focus on getting that command to run properly first.

FWIW, when I run that command on my single-node instance, I get this:

[e...@vm948.dev ~]$ RHIPEWHAT=0 R CMD /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce --slave --silent --vanilla
/usr/local/lib64/R/bin/Rcmd: line 62: /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce: No such file or directory
/usr/local/lib64/R/bin/Rcmd: line 62: exec: /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce: cannot execute: No such file or directory

This is the box where the MR job runs fine from R/Rhipe.  Is there anything I can look at to troubleshoot that further, or does that look ok?

Luke.

Saptarshi Guha

May 20, 2014, 7:00:34 PM
to Luke Hallett, rh...@googlegroups.com
On Tue, May 20, 2014 at 3:52 PM, Luke Hallett <lhal...@gmail.com> wrote:
> /RhipeMapReduce --slave --silent --vanilla
> /usr/local/lib64/R/bin/Rcmd: line 62:
> /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce: No such file or
> directory
> /usr/local/lib64/R/bin/Rcmd: line 62: exec:
> /usr/local/lib64/R/library/Rhipe/bin/RhipeMapReduce: cannot execute: No such
> file or directory


You shouldn't be getting those errors even on a single-node instance.
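
One way to confirm where the runner actually lives on a given box is to ask R itself; system.file() returns the installed location of a file inside a package, or "" if it does not exist:

## prints the full path of the installed runner, or "" if it is missing
system.file("bin", "RhipeMapReduce", package = "Rhipe")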

Ruizhu Huang

Apr 27, 2015, 11:15:12 AM
to rh...@googlegroups.com, saptars...@gmail.com, lhal...@gmail.com
I have the same issue. 

c252-101.wrangler(86)$ RHIPEWHAT=0 R CMD /home/03076/rhuang/R/lib64/R/library/Rhipe/bin/RhipeMapReduce

R version 3.1.2 (2014-10-31) -- "Pumpkin Helmet"
Copyright (C) 2014 The R Foundation for Statistical Computing
Platform: x86_64-redhat-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

Segmentation fault (core dumped)

Ruizhu Huang

Apr 27, 2015, 2:53:28 PM
to rh...@googlegroups.com, lhal...@gmail.com, saptars...@gmail.com
I solved the problem with:

chmod -R 755 $HOME/R/lib64/R/etc

chmod -R 755 $HOME/R/lib64/R/library/Rhipe/

After that, everything works fine.
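
That fix is consistent with the tasks running as a different user than the one who installed R: the TaskTracker/NodeManager account needs read and execute permission on the whole R library tree, not just the owner. As a hedged check (the mapred user below is illustrative; substitute whatever account your tasks actually run as), try launching the runner as that user:

# should start and sit waiting for input, not die with a permission error
sudo -u mapred env RHIPEWHAT=0 R CMD /home/03076/rhuang/R/lib64/R/library/Rhipe/bin/RhipeMapReduce --slave --silent --vanilla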

Pablo Arias

Jun 3, 2015, 8:38:22 PM
to rh...@googlegroups.com, saptars...@gmail.com, lhal...@gmail.com
Ruizhu, 

Did you have to do this on all the nodes?

Jeremiah Rounds

Sep 25, 2015, 12:08:31 PM
to rhipe, saptars...@gmail.com, lhal...@gmail.com
Maybe, if you are not using a Rhipe distributed cache (for example, a tarball created by bashRhipeArchive() in R).

The issue with the above permissions might have been that the executable that runs during MapReduce didn't have permission to run even on the front end (hence the chmod fixed it). But on the worker nodes, during MapReduce jobs, Rhipe is doing one of two things:
1) Running the executable already present on the worker node, RhipeMapReduce (once per MapReduce task), or
2) Running an executable distributed to the worker nodes via a "zip" argument in Rhipe, which ships a tarball out to the workers through Hadoop's distributed cache.


For the first, you may well need to chmod appropriately on each node; the second should be handled for you already.
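
For reference, the tarball route looks roughly like this, following the pattern in the Rhipe installation docs; the archive name and HDFS path here are illustrative, and the runner path inside the tarball depends on how the archive was built:

library(Rhipe)
rhinit()

## build a tarball of the local R installation plus installed packages
## and push it to HDFS (run once from a node where Rhipe already works)
bashRhipeArchive("R.Pkg")

## ship and unpack that runtime via the distributed cache for each job,
## instead of relying on a per-node R install
rhoptions(zips = "/R.Pkg.tar.gz")
rhoptions(runner = "sh ./R.Pkg/library/Rhipe/bin/RhipeMapReduce.sh")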