Rhipe_0.75.0_cdh5mr2 Error: java.io.IOException: java.io.IOException: Stream closed


Ruizhu Huang

May 14, 2015, 3:44:04 PM
to rh...@googlegroups.com

Loading required package: methods

Loading required package: codetools

Loading required package: rJava

------------------------------------------------

| Please call rhinit() else RHIPE will not run |

------------------------------------------------

Rhipe: Using Rhipe.jar file

Initializing Rhipe v0.75.0

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/usr/lib/hadoop/client/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

2015-05-14 14:08:56,925 WARN  [main][NativeCodeLoader] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Initializing mapfile caches

Deleted 1 file

Saving 1 parameter to /tmp/rhipe-temp-params-1cdafc8d34fa580be3d31b7723dcce92 (use rhclean to delete all temp files)

[Thu May 14 14:08:58 2015] Name:rhipe--20-10-FALSE-book.txt Job: job_1431616662370_0013  State: PREP Duration: 0.096

URL: http://c252-101.wrangler.tacc.utexas.edu:8088/proxy/application_1431616662370_0013/

       pct numtasks pending running complete killed failed_attempts

map      0        0       0       0        0      0               0

reduce   0        0       0       0        0      0               0

       killed_attempts

map                  0

reduce               0

Waiting 5 seconds

[Thu May 14 14:09:04 2015] Name:rhipe--20-10-FALSE-book.txt Job: job_1431616662370_0013  State: RUNNING Duration: 5.267

URL: http://c252-101.wrangler.tacc.utexas.edu:8088/proxy/application_1431616662370_0013/

       pct numtasks pending running complete killed failed_attempts

map      0        1       1       0        0      0               0

reduce   0       10      10       0        0      0               0

       killed_attempts

map                  0

reduce               0

Waiting 5 seconds

There were Hadoop specific errors (autokill will not kill job), showing at most 30:

Error: java.io.IOException: java.io.IOException: Stream closed

at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)

at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)

at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)

at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)

at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:415)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)

at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)

Caused by: java.io.IOException: Stream closed

at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)

at java.io.OutputStream.write(OutputStream.java:116)

at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)

at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)

at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)

at java.io.DataOutputStream.write(DataOutputStream.java:107)

at org.godhuli.rhipe.RHBytesWritable.write(RHBytesWritable.java:121)

at org.godhuli.rhipe.RHMRHelper.write(RHMRHelper.java:341)

at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:126)

... 8 more

[Thu May 14 14:09:09 2015] Name:rhipe--20-10-FALSE-book.txt Job: job_1431616662370_0013  State: RUNNING Duration: 10.404

URL: http://c252-101.wrangler.tacc.utexas.edu:8088/proxy/application_1431616662370_0013/

       pct numtasks pending running complete killed failed_attempts

map      0        1       0       1        0      0               1

reduce   0       10      10       0        0      0               0

       killed_attempts

map                  0

reduce               0

Waiting 5 seconds

There were Hadoop specific errors (autokill will not kill job), showing at most 30:

Error: java.io.IOException: java.io.IOException: Stream closed

[stack trace identical to the one above]

[Thu May 14 14:09:14 2015] Name:rhipe--20-10-FALSE-book.txt Job: job_1431616662370_0013  State: RUNNING Duration: 15.497

URL: http://c252-101.wrangler.tacc.utexas.edu:8088/proxy/application_1431616662370_0013/

       pct numtasks pending running complete killed failed_attempts

map      0        1       0       1        0      0               2

reduce   0       10      10       0        0      0               0

       killed_attempts

map                  0

reduce               0

Waiting 5 seconds

There were Hadoop specific errors (autokill will not kill job), showing at most 30:

Error: java.io.IOException: java.io.IOException: Stream closed

[stack trace identical to the one above]

[Thu May 14 14:09:19 2015] Name:rhipe--20-10-FALSE-book.txt Job: job_1431616662370_0013  State: RUNNING Duration: 20.579

URL: http://c252-101.wrangler.tacc.utexas.edu:8088/proxy/application_1431616662370_0013/

       pct numtasks pending running complete killed failed_attempts

map      1        1       0       0        0      0               4

reduce   1       10       0       0        0     10               0

       killed_attempts

map                  0

reduce               0

Waiting 5 seconds

There were Hadoop specific errors (autokill will not kill job), showing at most 30:

Error: java.io.IOException: java.io.IOException: Stream closed

[stack trace identical to the one above]

Warning message:

In Rhipe:::rhwatch.runner(job = job, mon.sec = mon.sec, readback = readback,  :

  Job failure, deleting output: /user/rhuang/output/rhipe_140857book.txtM20R10:

Ruizhu Huang

May 14, 2015, 3:48:17 PM
to rh...@googlegroups.com
logs


15/05/14 14:45:38 INFO client.RMProxy: Connecting to ResourceManager at c252-101.wrangler.tacc.utexas.edu/129.114.58.144:8032

Container: container_1431616662370_0013_01_000003 on c252-102.wrangler.tacc.utexas.edu_40231

==============================================================================================

LogType: stderr

LogLength: 1394

Log Contents:

java.io.IOException: Stream closed

at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)

at java.io.OutputStream.write(OutputStream.java:116)

at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)

at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)

at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)

at java.io.DataOutputStream.write(DataOutputStream.java:107)

at org.godhuli.rhipe.RHBytesWritable.write(RHBytesWritable.java:121)

at org.godhuli.rhipe.RHMRHelper.write(RHMRHelper.java:341)

at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:126)

at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)

at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)

at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)

at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:415)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)

at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.impl.MetricsSystemImpl).

log4j:WARN Please initialize the log4j system properly.

log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.


LogType: stdout

LogLength: 19194

Log Contents:

Classloader sun.misc.Launcher$AppClassLoader@499a12ee:

file:/hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/container_1431616662370_0013_01_000003/

file:/hdfs-d5/3/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/filecache/10/job.jar/job.jar

file:/hdfs-d5/3/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/filecache/10/job.jar/classes

file:/hdfs-d5/3/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/filecache/10/job.jar/lib/*

file:/hdfs-d5/3/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/filecache/10/job.jar/

file:/etc/hadoop/conf.d5/

file:/usr/lib/parquet/lib/parquet-format-2.1.0-cdh5.3.0-sources.jar

file:/usr/lib/hadoop/hadoop-auth-2.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-protobuf-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-cascading-1.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/hadoop-auth-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/hadoop-common-2.5.0-cdh5.3.0-tests.jar

file:/usr/lib/parquet/lib/parquet-hadoop-bundle-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-hadoop-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-common-1.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/hadoop-nfs-2.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-encoding-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-scala_2.10-1.5.0-cdh5.4.0-SNAPSHOT.jar

file:/usr/lib/parquet/lib/parquet-scrooge_2.10-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-generator-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-pig-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-test-hadoop2-1.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/hadoop-common-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/hadoop-annotations-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/hadoop-aws-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/hadoop-common-2.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-jackson-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-avro-1.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/hadoop-nfs-2.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-tools-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-column-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-pig-bundle-1.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/hadoop-aws-2.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-format-2.1.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-thrift-1.5.0-cdh5.3.0.jar

file:/usr/lib/parquet/lib/parquet-format-2.1.0-cdh5.3.0-javadoc.jar

file:/usr/lib/hadoop/hadoop-annotations-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar

file:/usr/lib/avro/avro-1.7.6-cdh5.3.0.jar

file:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar

file:/usr/lib/hadoop/lib/activation-1.1.jar

file:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar

file:/usr/lib/hadoop/lib/curator-recipes-2.6.0.jar

file:/usr/lib/hadoop/lib/jsr305-1.3.9.jar

file:/usr/lib/hadoop/lib/commons-digester-1.8.jar

file:/usr/lib/hadoop/lib/paranamer-2.3.jar

file:/usr/lib/zookeeper/zookeeper-3.4.5-cdh5.3.0.jar

file:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar

file:/usr/lib/hadoop/lib/jsp-api-2.1.jar

file:/usr/lib/hadoop/lib/xmlenc-0.52.jar

file:/usr/lib/hadoop/lib/jackson-mapper-asl-1.8.8.jar

file:/usr/lib/hadoop/lib/commons-net-3.1.jar

file:/usr/lib/hadoop/lib/gson-2.2.4.jar

file:/usr/lib/hadoop/lib/jackson-xc-1.8.8.jar

file:/usr/lib/hadoop/lib/servlet-api-2.5.jar

file:/usr/lib/hadoop/lib/commons-cli-1.2.jar

file:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar

file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar

file:/usr/lib/hadoop/lib/jersey-core-1.9.jar

file:/usr/lib/hadoop/lib/jersey-json-1.9.jar

file:/usr/lib/hadoop/lib/curator-framework-2.6.0.jar

file:/usr/lib/hadoop/lib/junit-4.11.jar

file:/usr/lib/hadoop/lib/jackson-core-asl-1.8.8.jar

file:/usr/lib/hadoop/lib/commons-lang-2.6.jar

file:/usr/lib/hadoop/lib/log4j-1.2.17.jar

file:/usr/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar

file:/usr/lib/hadoop/lib/xz-1.0.jar

file:/usr/lib/hadoop/lib/httpclient-4.2.5.jar

file:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar

file:/usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.4.jar

file:/usr/lib/hadoop/lib/jets3t-0.9.0.jar

file:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar

file:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar

file:/usr/lib/hadoop/lib/asm-3.2.jar

file:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar

file:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar

file:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar

file:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar

file:/usr/lib/hadoop/lib/commons-collections-3.2.1.jar

file:/usr/lib/hadoop/lib/aws-java-sdk-1.7.4.jar

file:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar

file:/usr/lib/hadoop/lib/commons-io-2.4.jar

file:/usr/lib/hadoop/lib/jsch-0.1.42.jar

file:/usr/lib/hadoop/lib/jersey-server-1.9.jar

file:/usr/lib/hadoop/lib/jetty-6.1.26.cloudera.4.jar

file:/usr/lib/hadoop/lib/curator-client-2.6.0.jar

file:/usr/lib/hadoop/lib/stax-api-1.0-2.jar

file:/usr/lib/hadoop/lib/jettison-1.1.jar

file:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar

file:/usr/lib/hadoop/lib/jasper-compiler-5.5.23.jar

file:/usr/lib/hadoop/lib/guava-11.0.2.jar

file:/usr/lib/hadoop/lib/commons-codec-1.4.jar

file:/usr/lib/hadoop/lib/jasper-runtime-5.5.23.jar

file:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar

file:/usr/lib/hadoop/lib/httpcore-4.2.5.jar

file:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar

file:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar

file:/usr/lib/hadoop/lib/commons-el-1.0.jar

file:/usr/lib/hadoop/lib/slf4j-api-1.7.5.jar

file:/usr/lib/hadoop/lib/commons-configuration-1.6.jar

file:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar

file:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.5.0-cdh5.3.0-tests.jar

file:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar

file:/usr/lib/hadoop-hdfs/lib/jsr305-1.3.9.jar

file:/usr/lib/hadoop-hdfs/lib/jsp-api-2.1.jar

file:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar

file:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar

file:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar

file:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar

file:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar

file:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar

file:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar

file:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar

file:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.4.jar

file:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar

file:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar

file:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar

file:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar

file:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.4.jar

file:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar

file:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar

file:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar

file:/usr/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar

file:/usr/lib/hadoop-hdfs/lib/commons-el-1.0.jar

file:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-extras-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-auth-2.5.0-cdh5.3.0.jar

file:/usr/lib/avro/avro-1.7.6-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.5.0-cdh5.3.0-tests.jar

file:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar

file:/usr/lib/hadoop-mapreduce/activation-1.1.jar

file:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-auth-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar

file:/usr/lib/hadoop-mapreduce/curator-recipes-2.6.0.jar

file:/usr/lib/hadoop-mapreduce/jsr305-1.3.9.jar

file:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar

file:/usr/lib/zookeeper/zookeeper-3.4.5-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-sls-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-extras-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-azure-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar

file:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar

file:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.8.8.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar

file:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar

file:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar

file:/usr/lib/hadoop-mapreduce/jackson-xc-1.8.8.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar

file:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar

file:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar

file:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar

file:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar

file:/usr/lib/hadoop-mapreduce/curator-framework-2.6.0.jar

file:/usr/lib/hadoop-mapreduce/junit-4.11.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/joda-time-1.6.jar

file:/usr/lib/hadoop-mapreduce/hadoop-archives-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.8.8.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-azure-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar

file:/usr/lib/hadoop-mapreduce/hadoop-sls-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar

file:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.8.8.jar

file:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar

file:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/xz-1.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.5.0-cdh5.3.0-tests.jar

file:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar

file:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar

file:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.cloudera.4.jar

file:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar

file:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar

file:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/asm-3.2.jar

file:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar

file:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar

file:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar

file:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar

file:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/commons-collections-3.2.1.jar

file:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar

file:/usr/lib/hadoop-mapreduce/hadoop-archives-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar

file:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar

file:/usr/lib/hadoop-mapreduce/jetty-6.1.26.cloudera.4.jar

file:/usr/lib/hadoop-mapreduce/curator-client-2.6.0.jar

file:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar

file:/usr/lib/hadoop-mapreduce/jettison-1.1.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar

file:/usr/lib/hadoop-mapreduce/jasper-compiler-5.5.23.jar

file:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/jasper-runtime-5.5.23.jar

file:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar

file:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar

file:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar

file:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar

file:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar

file:/usr/lib/hadoop-mapreduce/commons-el-1.0.jar

file:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar

file:/usr/lib/hadoop-mapreduce/microsoft-windowsazure-storage-sdk-0.6.0.jar

file:/usr/lib/avro/avro-1.7.6-cdh5.3.0.jar

file:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar

file:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar

file:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar

file:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar

file:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar

file:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar

file:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar

file:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar

file:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar

file:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar

file:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar

file:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar

file:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar

file:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar

file:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar

file:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar

file:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar

file:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar

file:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar

file:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar

file:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.5.0-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar

file:/usr/lib/hadoop-yarn/lib/activation-1.1.jar

file:/usr/lib/hadoop-yarn/lib/jsr305-1.3.9.jar

file:/usr/lib/zookeeper/zookeeper-3.4.5-cdh5.3.0.jar

file:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar

file:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar

file:/usr/lib/hadoop-yarn/lib/guice-3.0.jar

file:/usr/lib/hadoop-yarn/lib/jackson-xc-1.8.8.jar

file:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar

file:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar

file:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar

file:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar

file:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar

file:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.8.8.jar

file:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar

file:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar

file:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.8.8.jar

file:/usr/lib/hadoop-yarn/lib/xz-1.0.jar

file:/usr/lib/hadoop-yarn/lib/commons-httpclient-3.1.jar

file:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar

file:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.cloudera.4.jar

file:/usr/lib/hadoop-yarn/lib/asm-3.2.jar

file:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar

file:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar

file:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar

file:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar

file:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.1.jar

file:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar

file:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar

file:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.cloudera.4.jar

file:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar

file:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar

file:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar

file:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar

file:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar

file:/usr/lib/hadoop-yarn/lib/jline-0.9.94.jar

file:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar

file:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar

file:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/*

file:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/lib/*

Classloader sun.misc.Launcher$ExtClassLoader@2250ed02:

file:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre/lib/ext/sunjce_provider.jar

file:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre/lib/ext/zipfs.jar

file:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre/lib/ext/localedata.jar

file:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre/lib/ext/dnsns.jar

file:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre/lib/ext/gnome-java-bridge.jar

file:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre/lib/ext/sunpkcs11.jar

file:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre/lib/ext/pulse-java.jar

arg count: 6

/opt/apps/intel15/mvapich2_2_1/Rstats/3.1.3/lib64/R/bin/R

CMD

/opt/apps/intel15/mvapich2_2_1/big-data-r/3.1.3/r-library/Rhipe/bin/RhipeMapReduce

--slave

--silent

--vanilla

total bytes of args:166


LogType: syslog

LogLength: 9867

Log Contents:

2015-05-14 14:09:08,823 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.

2015-05-14 14:09:08,849 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.

2015-05-14 14:09:09,128 INFO [main] org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties

2015-05-14 14:09:09,184 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).

2015-05-14 14:09:09,184 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started

2015-05-14 14:09:09,192 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:

2015-05-14 14:09:09,192 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1431616662370_0013, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@220ca8ce)

2015-05-14 14:09:09,255 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.

2015-05-14 14:09:09,437 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013,/hdfs-d5/2/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013,/hdfs-d5/3/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013,/hdfs-d5/4/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013

2015-05-14 14:09:09,531 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.

2015-05-14 14:09:09,539 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.

2015-05-14 14:09:09,752 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id

2015-05-14 14:09:10,111 INFO [main] org.apache.hadoop.mapred.Task:  Using ResourceCalculatorProcessTree : [ ]

2015-05-14 14:09:10,253 WARN [main] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 14717

2015-05-14 14:09:10,253 WARN [main] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 14718

2015-05-14 14:09:10,253 WARN [main] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 14720

2015-05-14 14:09:10,253 WARN [main] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 14721

2015-05-14 14:09:10,253 WARN [main] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 14723

2015-05-14 14:09:10,253 WARN [main] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 14724

2015-05-14 14:09:10,253 WARN [main] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 14726

2015-05-14 14:09:10,254 WARN [main] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 14727

2015-05-14 14:09:10,384 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: hdfs://c252-101.wrangler.tacc.utexas.edu:8020/user/rhuang/data/book.txt:0+400115

2015-05-14 14:09:10,771 INFO [main] org.apache.hadoop.mapred.MapTask: (EQUATOR) 0 kvi 402653180(1610612720)

2015-05-14 14:09:10,772 INFO [main] org.apache.hadoop.mapred.MapTask: mapreduce.task.io.sort.mb: 1536

2015-05-14 14:09:10,772 INFO [main] org.apache.hadoop.mapred.MapTask: soft limit at 1288490240

2015-05-14 14:09:10,772 INFO [main] org.apache.hadoop.mapred.MapTask: bufstart = 0; bufvoid = 1610612736

2015-05-14 14:09:10,772 INFO [main] org.apache.hadoop.mapred.MapTask: kvstart = 402653180; length = 100663296

2015-05-14 14:09:10,777 INFO [main] org.apache.hadoop.mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer

2015-05-14 14:09:10,779 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: mapred.linerecordreader.maxlength is deprecated. Instead, use mapreduce.input.linerecordreader.line.maxlength

2015-05-14 14:09:10,798 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces

2015-05-14 14:09:10,799 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: io.sort.mb is deprecated. Instead, use mapreduce.task.io.sort.mb

2015-05-14 14:09:10,807 INFO [main] org.godhuli.rhipe.RHMRHelper: rhipe_setup_map::writing to file:/hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/container_1431616662370_0013_01_000003/rhipe_setup_map

2015-05-14 14:09:10,808 INFO [main] org.godhuli.rhipe.RHMRHelper: rhipe_cleanup_map::writing to file:/hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/container_1431616662370_0013_01_000003/rhipe_cleanup_map

2015-05-14 14:09:10,808 INFO [main] org.godhuli.rhipe.RHMRHelper: rhipe_cleanup_reduce::writing to file:/hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/container_1431616662370_0013_01_000003/rhipe_cleanup_reduce

2015-05-14 14:09:10,808 INFO [main] org.godhuli.rhipe.RHMRHelper: rhipe_setup_reduce::writing to file:/hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/container_1431616662370_0013_01_000003/rhipe_setup_reduce

2015-05-14 14:09:10,808 INFO [main] org.godhuli.rhipe.RHMRHelper: rhipe_reduce_prekey::writing to file:/hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/container_1431616662370_0013_01_000003/rhipe_reduce_prekey

2015-05-14 14:09:10,809 INFO [main] org.godhuli.rhipe.RHMRHelper: rhipe_map::writing to file:/hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/container_1431616662370_0013_01_000003/rhipe_map

2015-05-14 14:09:10,809 INFO [main] org.godhuli.rhipe.RHMRHelper: rhipe_reduce_postkey::writing to file:/hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/container_1431616662370_0013_01_000003/rhipe_reduce_postkey

2015-05-14 14:09:10,811 INFO [main] org.godhuli.rhipe.RHMRHelper: rhipe_reduce::writing to file:/hdfs-d5/1/yarn/local/usercache/rhuang/appcache/application_1431616662370_0013/container_1431616662370_0013_01_000003/rhipe_reduce

2015-05-14 14:09:10,819 INFO [main] org.godhuli.rhipe.RHMRHelper: Mapper:Started external program:/opt/apps/intel15/mvapich2_2_1/Rstats/3.1.3/lib64/R/bin/R CMD /opt/apps/intel15/mvapich2_2_1/big-data-r/3.1.3/r-library/Rhipe/bin/RhipeMapReduce --slave --silent --vanilla

2015-05-14 14:09:10,819 INFO [main] org.godhuli.rhipe.RHMRHelper: Mapper:Started Error Thread

2015-05-14 14:09:10,821 INFO [main] org.godhuli.rhipe.RHMRHelper: Mapper:Started Output Thread

2015-05-14 14:09:10,832 INFO [Thread-12] org.godhuli.rhipe.RHMRHelper: Mapper:MROutputThread done

2015-05-14 14:09:10,832 INFO [Thread-11] org.godhuli.rhipe.RHMRHelper: Mapper:MRErrorThread done

2015-05-14 14:09:10,844 INFO [main] org.godhuli.rhipe.RHMRMapper: QUIIIITING:127

2015-05-14 14:09:10,846 INFO [main] org.apache.hadoop.mapred.MapTask: Starting flush of map output

2015-05-14 14:09:10,854 INFO [main] org.apache.hadoop.io.compress.zlib.ZlibFactory: Successfully loaded & initialized native-zlib library

2015-05-14 14:09:10,855 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new compressor [.deflate]

2015-05-14 14:09:10,862 WARN [main] org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:rhuang (auth:SIMPLE) cause:java.io.IOException: java.io.IOException: Stream closed

2015-05-14 14:09:10,862 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.io.IOException: java.io.IOException: Stream closed

at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:132)

at org.godhuli.rhipe.RHMRMapper.run(RHMRMapper.java:60)

at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)

at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)

at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:415)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)

at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)

Caused by: java.io.IOException: Stream closed

at java.lang.ProcessBuilder$NullOutputStream.write(ProcessBuilder.java:434)

at java.io.OutputStream.write(OutputStream.java:116)

at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)

at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)

at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)

at java.io.DataOutputStream.write(DataOutputStream.java:107)

at org.godhuli.rhipe.RHBytesWritable.write(RHBytesWritable.java:121)

at org.godhuli.rhipe.RHMRHelper.write(RHMRHelper.java:341)

at org.godhuli.rhipe.RHMRMapper.map(RHMRMapper.java:126)

... 8 more


2015-05-14 14:09:10,865 INFO [main] org.apache.hadoop.mapred.Task: Runnning cleanup for the task

2015-05-14 14:09:10,868 WARN [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not delete hdfs://c252-101.wrangler.tacc.utexas.edu:8020/user/rhuang/output/rhipe_140857book.txtM20R10/_temporary/1/_temporary/attempt_1431616662370_0013_m_000000_1


Ruizhu Huang

May 20, 2015, 12:34:44 PM
to rh...@googlegroups.com
Finally figured it out: pass LD_LIBRARY_PATH to the map/reduce tasks through the mapred option:

mapred = list(
  # build a colon-separated library path from environment variables
  # (note the quotes: Sys.getenv("PROTOBUF_LIB"), not Sys.getenv(PROTOBUF_LIB))
  LD_LIBRARY_PATH = paste(Sys.getenv("PROTOBUF_LIB"),
                          Sys.getenv("TACC_MKL_LIB"),
                          Sys.getenv("ICC_LIB"),
                          sep = ":")
)
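For context, a sketch of how a mapred list like this gets wired into a RHIPE job via rhwatch. Only the mapred list is the actual fix from this thread; the input/output paths and the toy word-count map/reduce are illustrative placeholders, and this of course needs a running Hadoop cluster with RHIPE installed.

```r
# Sketch: passing LD_LIBRARY_PATH to the R worker processes through mapred.
# Paths and the word-count logic are placeholders, not part of the fix.
library(Rhipe)
rhinit()

mapred <- list(
  LD_LIBRARY_PATH = paste(Sys.getenv("PROTOBUF_LIB"),
                          Sys.getenv("TACC_MKL_LIB"),
                          Sys.getenv("ICC_LIB"),
                          sep = ":")
)

z <- rhwatch(
  map = expression({
    # emit (word, 1) for every word on each input line
    lapply(map.values, function(line) {
      for (w in unlist(strsplit(line, " +"))) rhcollect(w, 1L)
    })
  }),
  reduce = expression(
    pre    = { total <- 0L },
    reduce = { total <- total + sum(unlist(reduce.values)) },
    post   = { rhcollect(reduce.key, total) }
  ),
  input  = rhfmt("/user/rhuang/data/book.txt", type = "text"),
  output = "/user/rhuang/output/wordcount",
  mapred = mapred
)
```

Without the LD_LIBRARY_PATH entry, the R executable spawned on each task node exits immediately (the "QUIIIITING:127" line in the syslog above — exit code 127 typically means the program or one of its shared libraries could not be found), and the Java side then fails with "Stream closed" when it tries to write to the dead child process.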



On Thursday, May 14, 2015 at 2:44:04 PM UTC-5, Ruizhu Huang wrote: