jj@jj-VirtualBox:~/softwares/R/rmr2-master/build$ sudo R CMD check rmr2_2.0.2.tar.gz
[sudo] password for jj:
* using log directory ‘/home/jj/softwares/R/rmr2-master/build/rmr2.Rcheck’
* using R version 2.15.2 (2012-10-26)
* using platform: i686-pc-linux-gnu (32-bit)
* using session charset: UTF-8
* checking for file ‘rmr2/DESCRIPTION’ ... OK
* checking extension type ... Package
* this is package ‘rmr2’ version ‘2.0.2’
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for executable files ... OK
* checking whether package ‘rmr2’ can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking for unstated dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... NOTE
is.keyval: no visible binding for global variable ‘key’
is.keyval: no visible binding for global variable ‘val’
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking line endings in C/C++/Fortran sources/headers ... OK
* checking line endings in Makefiles ... WARNING
Found the following Makefiles with CR or CRLF line endings:
src/Makevars
Some Unix ‘make’ programs require LF line endings.
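The Makevars warning above is mechanical to fix: strip the carriage returns so the file uses plain LF endings. A minimal sketch, demonstrated on a temporary copy so it is self-contained; in the package tree you would run the same `tr` command against `src/Makevars`:

```shell
#!/bin/sh
# Normalize CRLF -> LF in a Makevars-style file.
f=$(mktemp)
printf 'PKG_CPPFLAGS = -I.\r\nPKG_LIBS =\r\n' > "$f"   # simulate a CRLF file
tr -d '\r' < "$f" > "$f.lf" && mv "$f.lf" "$f"         # strip carriage returns
# od -c renders any remaining CR as the two characters "\r"
od -c "$f" | grep -q '\\r' && echo "CR still present" || echo "clean LF endings"
```

`dos2unix src/Makevars` does the same job where that tool is installed.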
* checking for portable compilation flags in Makevars ... OK
* checking for portable use of $(BLAS_LIBS) and $(LAPACK_LIBS) ... OK
* checking compiled code ... NOTE
File ‘/home/jj/softwares/R/rmr2-master/build/rmr2.Rcheck/rmr2/libs/rmr2.so’:
Found ‘_ZSt4cerr’, possibly from ‘std::cerr’ (C++)
Object: ‘typed-bytes.o’
Compiled code should not call functions which might terminate R nor
write to stdout/stderr instead of to the console.
See ‘Writing portable packages’ in the ‘Writing R Extensions’ manual.
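This NOTE points at `src/typed-bytes.o` writing to `std::cerr`; per 'Writing R Extensions', package C++ code should report through R's own channels (e.g. `REprintf` or `Rf_error`) rather than stderr. The symbol scan the check performs can be reproduced with `nm`. The sketch below builds a throwaway object so it is self-contained; in a real build you would point `nm` at `src/typed-bytes.o` instead:

```shell
#!/bin/sh
# Reproduce the check's scan for std::cerr references in compiled code.
workdir=$(mktemp -d)
cat > "$workdir/demo.cpp" <<'EOF'
#include <iostream>
// The kind of call the NOTE flags: package code writing to stderr.
void warn() { std::cerr << "oops\n"; }
EOF
if command -v g++ >/dev/null 2>&1; then
  g++ -c "$workdir/demo.cpp" -o "$workdir/demo.o"
  # _ZSt4cerr is the mangled name reported in the NOTE above
  nm "$workdir/demo.o" | grep _ZSt4cerr && found=yes || found=no
else
  found=skipped   # no compiler here; run this in the build environment
fi
echo "cerr reference: $found"
```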
* checking examples ... ERROR
Running examples in ‘rmr2-Ex.R’ failed
The error most likely occurred in:
> ### Name: dfs.empty
> ### Title: Get a directory or file size or check if it is empty
> ### Aliases: dfs.empty dfs.size
>
> ### ** Examples
>
> dfs.empty(mapreduce(to.dfs(1:10)))
13/02/28 14:16:44 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
13/02/28 14:16:44 INFO compress.CodecPool: Got brand-new compressor [.deflate]
13/02/28 14:16:46 WARN streaming.StreamJob: -file option is deprecated, please use generic option -files instead.
packageJobJar: [/tmp/RtmpRVqx3A/rmr-local-envf7313172ceb, /tmp/RtmpRVqx3A/rmr-global-envf73268da2cc, /tmp/RtmpRVqx3A/rmr-streaming-mapf733b6dd658] [/usr/lib/hadoop-mapreduce/hadoop-streaming-2.0.2-alpha.jar] /tmp/streamjob4595777032435723960.jar tmpDir=null
13/02/28 14:16:48 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is inited.
13/02/28 14:16:48 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is started.
13/02/28 14:16:48 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is inited.
13/02/28 14:16:48 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is started.
13/02/28 14:16:49 INFO mapred.FileInputFormat: Total input paths to process : 1
13/02/28 14:16:50 INFO mapreduce.JobSubmitter: number of splits:2
13/02/28 14:16:50 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
13/02/28 14:16:50 WARN conf.Configuration: mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files
13/02/28 14:16:50 WARN conf.Configuration: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
13/02/28 14:16:50 WARN conf.Configuration: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
13/02/28 14:16:50 WARN conf.Configuration: mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class
13/02/28 14:16:50 WARN conf.Configuration: mapred.job.name is deprecated. Instead, use mapreduce.job.name
13/02/28 14:16:50 WARN conf.Configuration: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
13/02/28 14:16:50 WARN conf.Configuration: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
13/02/28 14:16:50 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
13/02/28 14:16:50 WARN conf.Configuration: mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps
13/02/28 14:16:50 WARN conf.Configuration: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
13/02/28 14:16:50 WARN conf.Configuration: mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class
13/02/28 14:16:50 WARN conf.Configuration: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
13/02/28 14:16:50 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1362021180419_0001
13/02/28 14:16:51 INFO client.YarnClientImpl: Submitted application application_1362021180419_0001 to ResourceManager at /0.0.0.0:8032
13/02/28 14:16:51 INFO mapreduce.Job: The url to track the job: http://localhost:8088/proxy/application_1362021180419_0001/
13/02/28 14:16:51 INFO mapreduce.Job: Running job: job_1362021180419_0001
13/02/28 14:17:16 INFO mapreduce.Job: Job job_1362021180419_0001 running in uber mode : false
13/02/28 14:17:16 INFO mapreduce.Job: map 0% reduce 0%
13/02/28 14:17:40 INFO mapreduce.Job: map 50% reduce 0%
13/02/28 14:17:48 INFO mapreduce.Job: map 0% reduce 0%
13/02/28 14:17:48 INFO mapreduce.Job: Task Id : attempt_1362021180419_0001_m_000001_0, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:400)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:335)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1367)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
13/02/28 14:17:48 INFO mapreduce.Job: Task Id : attempt_1362021180419_0001_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:400)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:335)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1367)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
13/02/28 14:18:10 INFO mapreduce.Job: map 50% reduce 0%
13/02/28 14:18:14 INFO mapreduce.Job: Task Id : attempt_1362021180419_0001_m_000001_1, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:400)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:335)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1367)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
13/02/28 14:18:15 INFO mapreduce.Job: Task Id : attempt_1362021180419_0001_m_000000_1, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:400)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:335)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1367)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
13/02/28 14:18:16 INFO mapreduce.Job: map 0% reduce 0%
13/02/28 14:18:35 INFO mapreduce.Job: map 50% reduce 0%
13/02/28 14:18:38 INFO mapreduce.Job: Task Id : attempt_1362021180419_0001_m_000000_2, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:400)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:335)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1367)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
13/02/28 14:18:38 INFO mapreduce.Job: Task Id : attempt_1362021180419_0001_m_000001_2, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:400)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:335)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1367)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
13/02/28 14:18:39 INFO mapreduce.Job: map 0% reduce 0%
13/02/28 14:19:06 INFO mapreduce.Job: map 50% reduce 0%
13/02/28 14:19:11 INFO mapreduce.Job: map 0% reduce 0%
13/02/28 14:19:11 INFO mapreduce.Job: map 50% reduce 0%
13/02/28 14:19:11 INFO mapreduce.Job: Job job_1362021180419_0001 failed with state FAILED due to:
13/02/28 14:19:12 INFO mapreduce.Job: Counters: 29
  File System Counters
    FILE: Number of bytes read=120
    FILE: Number of bytes written=70902
    FILE: Number of read operations=0
    FILE: Number of large read operations=0
    FILE: Number of write operations=0
    HDFS: Number of bytes read=272
    HDFS: Number of bytes written=122
    HDFS: Number of read operations=4
    HDFS: Number of large read operations=0
    HDFS: Number of write operations=1
  Job Counters
    Failed map tasks=7
    Launched map tasks=8
    Other local map tasks=6
    Rack-local map tasks=2
    Total time spent by all maps in occupied slots (ms)=187378
    Total time spent by all reduces in occupied slots (ms)=0
  Map-Reduce Framework
    Map input records=0
    Map output records=0
    Input split bytes=104
    Spilled Records=0
    Failed Shuffles=0
    Merged Map outputs=0
    GC time elapsed (ms)=271
    CPU time spent (ms)=2260
    Physical memory (bytes) snapshot=69914624
    Virtual memory (bytes) snapshot=425742336
    Total committed heap usage (bytes)=16252928
  File Input Format Counters
    Bytes Read=168
  File Output Format Counters
    Bytes Written=122
13/02/28 14:19:12 ERROR streaming.StreamJob: Job not Successful!
Streaming Command Failed!
Error in mr(map = map, reduce = reduce, combine = combine, in.folder = if (is.list(input)) { :
hadoop streaming failed with error code 1
Calls: dfs.empty -> dfs.size -> to.dfs.path -> mapreduce -> mr
Execution halted
jj@jj-VirtualBox:~/softwares/R/rmr2-master/build$
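The actual failure is the streaming subprocess exiting with code 1: the Java stack trace only says the R child died, not why. With rmr2 this commonly traces back to the task environment rather than the package itself: `Rscript` must be on the PATH seen by the YARN containers, and rmr2 reads `HADOOP_CMD` and `HADOOP_STREAMING` when it loads. A hedged sanity-check sketch (variable names as documented by rmr2; confirm against your install):

```shell
#!/bin/sh
# Sanity checks for the environment rmr2's streaming tasks run in.
# Each map/reduce task is an R subprocess, so every node needs Rscript
# on PATH, plus the HADOOP_CMD / HADOOP_STREAMING variables rmr2 reads.
report=""
for var in HADOOP_CMD HADOOP_STREAMING; do
  if [ -n "$(printenv "$var" 2>/dev/null)" ]; then
    report="$report$var=set "
  else
    report="$report$var=missing "
  fi
done
if command -v Rscript >/dev/null 2>&1; then
  report="${report}Rscript=found"
else
  report="${report}Rscript=missing"
fi
echo "$report"
```

The decisive diagnostic, though, is the stderr of a failed attempt itself: the task logs reachable from the job-tracking URL printed above capture whatever the R subprocess wrote before exiting.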