> library(Rhipe)
------------------------------------------------
| Please call rhinit() else RHIPE will not run |
------------------------------------------------
> rhinit()
Rhipe: Using Rhipe.jar file
Initializing Rhipe v0.73
15/03/11 03:39:48 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
Initializing mapfile caches
The MapReduce code is:

Map <- function(k, v) {
    X <- runif(v)
    rhcollect(k, c(Min = min(X), Max = max(X)))
}
job <- rhwatch(map = Map, input = 10, reduce = 0,
               output = "./output/Phipe/test", jobname = "test")

But I cannot get it to run; the job fails with the output below:
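To be clear about what each map task is supposed to compute, here is the same min/max logic as plain R, runnable without Hadoop or RHIPE (map_logic is just an illustrative name I am using here; rhcollect is replaced by a plain return value):

```r
# Per-(k, v) computation the map function is meant to perform:
# draw v uniform random numbers and report their minimum and maximum.
map_logic <- function(v) {
    X <- runif(v)
    c(Min = min(X), Max = max(X))
}

set.seed(1)          # reproducible draw for this illustration
out <- map_logic(10)
print(out)           # a named vector with elements Min and Max, both in [0, 1]
```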
Loading required package: codetools
Saving 2 paramaters to /tmp/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11 (use rhclean to delete all temp files)
15/03/11 03:51:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
/./output/Phipe/test
15/03/11 03:51:29 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
15/03/11 03:51:30 INFO filecache.TrackerDistributedCacheManager: Creating rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11 in /tmp/hadoop-hadoop1/mapred/local/archive/7149779269302380530_877075559_141947392/file/tmp-work-1782392964810640082 with rwxr-xr-x
15/03/11 03:51:30 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11#rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11 as /tmp/hadoop-hadoop1/mapred/local/archive/7149779269302380530_877075559_141947392/file/tmp/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11
15/03/11 03:51:30 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11#rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11 as /tmp/hadoop-hadoop1/mapred/local/archive/7149779269302380530_877075559_141947392/file/tmp/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11
15/03/11 03:51:30 WARN mapred.LocalJobRunner: LocalJobRunner does not support symlinking into current working dir.
15/03/11 03:51:30 INFO mapred.TaskRunner: Creating symlink: /tmp/hadoop-hadoop1/mapred/local/archive/7149779269302380530_877075559_141947392/file/tmp/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11 <- /tmp/hadoop-hadoop1/mapred/local/localRunner/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11
15/03/11 03:51:30 WARN mapred.TaskRunner: Failed to create symlink: /tmp/hadoop-hadoop1/mapred/local/archive/7149779269302380530_877075559_141947392/file/tmp/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11 <- /tmp/hadoop-hadoop1/mapred/local/localRunner/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11
java.lang.ArrayIndexOutOfBoundsException: 1
at org.godhuli.rhipe.RHMR.runasync(RHMR.java:302)
at org.godhuli.rhipe.RHMR.submitAndMonitorJob(RHMR.java:321)
at org.godhuli.rhipe.RHMR.run(RHMR.java:150)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.godhuli.rhipe.RHMR.fmain(RHMR.java:101)
at org.godhuli.rhipe.PersonalServer.rhex(PersonalServer.java:166)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at RJavaTools.invokeMethod(RJavaTools.java:386)
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, :
java.lang.ArrayIndexOutOfBoundsException: 1
15/03/11 03:51:30 ERROR mapred.FileOutputCommitter: Mkdirs failed to create /output/Phipe/test/_temporary
15/03/11 03:51:30 INFO mapred.LocalJobRunner: Waiting for map tasks
15/03/11 03:51:30 INFO mapred.LocalJobRunner: Starting task: attempt_local_0001_m_000000_0
15/03/11 03:51:30 INFO util.ProcessTree: setsid exited with exit code 0
15/03/11 03:51:30 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@219ba640
15/03/11 03:51:30 INFO mapred.MapTask: Processing split: org.godhuli.rhipe.LApplyInputFormat$LApplyInputSplit@1f57ea4a
15/03/11 03:51:31 INFO mapred.LocalJobRunner: Map task executor complete.
15/03/11 03:51:31 WARN mapred.LocalJobRunner: job_local_0001
java.lang.Exception: java.io.IOException: Mkdirs failed to create file:/output/Phipe/test/_temporary/_attempt_local_0001_m_000000_0
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:349)
Caused by: java.io.IOException: Mkdirs failed to create file:/output/Phipe/test/_temporary/_attempt_local_0001_m_000000_0
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:379)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:365)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:608)
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:896)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:393)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:354)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:480)
at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:61)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:521)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:637)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:218)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)
15/03/11 03:51:31 INFO filecache.TrackerDistributedCacheManager: Deleted path /tmp/hadoop-hadoop1/mapred/local/archive/7149779269302380530_877075559_141947392/file/tmp/rhipe-temp-params-a72e6592b72eb53ab4a53f6187012d11
Thanks to anyone who can help me :)