3592367796239058 #趣找网#分享个好玩的活动给你们,都来看看吧。[礼物] 地址:http://t.cn/zH3sf2K 3592379158700163 #不旅行不青春#大家都来围观吧,给力奖品必须给力的来支持,好运来袭吧[礼物] 地址:http://t.cn/zH380tF 3592379204606985 #家的感动瞬间#喜欢就转发吧,下一个幸运会是你吗 地址:http://t.cn/zH33qsN 3592379209041862 #老友吧#[笑哈哈]期待好运呀,哇哈,支持!!!给力活动支持 地址:http://t.cn/zH334NZ 3592379209041745 #电信充值卡#[笑哈哈]哈哈,终于等到这个活动喽。等了好长时间了,来吧,大奖 地址:http://t.cn/zH1E5md 3592379255187784 好活动一定要珍惜,人生最珍贵的不是“得不到”和“已失去”而是现在能把握的幸福!平淡是真。[给力]
#!/usr/bin/env Rscript
library(rmr2)

## Emit the text column keyed by the weibo status id.
map <- function(k, v) keyval(v[[1]], v[[2]])
## Note: reduce is defined but not passed to mapreduce below, so this is a map-only job.
reduce <- function(k, vv) keyval(k, vv)

mapreduce(
  input = "/user/superman/senti/weibo-data.txt",
  output = "/user/superman/senti/output",
  input.format = make.input.format("csv", sep = "\t"),
  output.format = "text",
  map = map)
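[Editor's note] A likely culprit for a "line N did not have M elements" failure on free-form microblog text is read.table's default quote and comment handling, which make.input.format("csv") forwards: an apostrophe can open a quote and a # hashtag starts a comment. A minimal sketch of the difference (the sample strings here are made up, not from the actual data):

```r
## Free-form text with an apostrophe and a #hashtag#, tab-separated.
txt <- "10\tit's #fun#\n11\tplain"

## read.table defaults: comment.char = "#" silently drops the hashtag
## and everything after it.
naive <- read.table(text = txt, sep = "\t", stringsAsFactors = FALSE)

## Disabling quoting and comments keeps the text column intact; the same
## arguments can be forwarded via make.input.format("csv", sep = "\t", ...).
robust <- read.table(text = txt, sep = "\t", quote = "",
                     comment.char = "", stringsAsFactors = FALSE)
robust$V2[1]  # "it's #fun#"
```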
Loading required package: rmr2
Loading required package: Rcpp
Loading required package: methods
Loading required package: RJSONIO
Loading required package: bitops
Loading required package: digest
Loading required package: functional
Loading required package: stringr
Loading required package: plyr
Loading required package: reshape2
Error in value[[3L]](cond) : line 56 did not have 2 elements
Calls: <Anonymous> ... tryCatch -> tryCatchList -> tryCatchOne -> <Anonymous>
Execution halted
13/07/05 17:42:55 INFO streaming.PipeMapRed: MRErrorThread done
13/07/05 17:42:55 INFO streaming.PipeMapRed: log:null
R/W/S=761/0/0 in:NA [rec/s] out:NA [rec/s]
minRecWrittenToEnableSkip_=9223372036854775807 LOGNAME=null
HOST=null
USER=superman
HADOOP_USER=null
last Hadoop input: |null|
last tool output: |null|
Date: Fri Jul 05 17:42:55 CST 2013
java.io.IOException: Broken pipe
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:282)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
at java.io.DataOutputStream.write(DataOutputStream.java:90)
at org.apache.hadoop.streaming.io.TextInputWriter.writeUTF8(TextInputWriter.java:72)
at org.apache.hadoop.streaming.io.TextInputWriter.writeValue(TextInputWriter.java:51)
at org.apache.hadoop.streaming.PipeMapper.map(PipeMapper.java:110)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:418)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)
13/07/05 17:42:55 WARN streaming.PipeMapRed: java.io.IOException: Broken pipe
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:282)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
at java.io.DataOutputStream.flush(DataOutputStream.java:106)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:569)
at org.apache.hadoop.streaming.PipeMapper.map(PipeMapper.java:125)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:418)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)
13/07/05 17:42:55 INFO streaming.PipeMapRed: mapRedFinished
13/07/05 17:42:55 WARN streaming.PipeMapRed: java.io.IOException: Bad file descriptor
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:282)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
at java.io.DataOutputStream.flush(DataOutputStream.java:106)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:569)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:136)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:418)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)
13/07/05 17:42:55 INFO streaming.PipeMapRed: mapRedFinished
13/07/05 17:42:55 INFO mapred.LocalJobRunner: Map task executor complete.
13/07/05 17:42:55 INFO streaming.StreamJob: map 0% reduce 0%
13/07/05 17:42:55 WARN mapred.LocalJobRunner: job_local1776539990_0001
java.lang.Exception: java.io.IOException: log:null
R/W/S=761/0/0 in:NA [rec/s] out:NA [rec/s]
minRecWrittenToEnableSkip_=9223372036854775807 LOGNAME=null
HOST=null
USER=superman
HADOOP_USER=null
last Hadoop input: |null|
last tool output: |null|
Date: Fri Jul 05 17:42:55 CST 2013
java.io.IOException: Broken pipe
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:282)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
at java.io.DataOutputStream.write(DataOutputStream.java:90)
at org.apache.hadoop.streaming.io.TextInputWriter.writeUTF8(TextInputWriter.java:72)
at org.apache.hadoop.streaming.io.TextInputWriter.writeValue(TextInputWriter.java:51)
at org.apache.hadoop.streaming.PipeMapper.map(PipeMapper.java:110)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:418)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:404)
Caused by: java.io.IOException: log:null
R/W/S=761/0/0 in:NA [rec/s] out:NA [rec/s]
minRecWrittenToEnableSkip_=9223372036854775807 LOGNAME=null
HOST=null
USER=superman
HADOOP_USER=null
last Hadoop input: |null|
last tool output: |null|
Date: Fri Jul 05 17:42:55 CST 2013
java.io.IOException: Broken pipe
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:282)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
at java.io.DataOutputStream.write(DataOutputStream.java:90)
at org.apache.hadoop.streaming.io.TextInputWriter.writeUTF8(TextInputWriter.java:72)
at org.apache.hadoop.streaming.io.TextInputWriter.writeValue(TextInputWriter.java:51)
at org.apache.hadoop.streaming.PipeMapper.map(PipeMapper.java:110)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:418)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)
at org.apache.hadoop.streaming.PipeMapper.map(PipeMapper.java:126)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:418)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)
13/07/05 17:42:56 INFO streaming.StreamJob: Job running in-process (local Hadoop)
13/07/05 17:42:56 ERROR streaming.StreamJob: Job not successful. Error: NA
13/07/05 17:42:56 INFO streaming.StreamJob: killJob...
Streaming Command Failed!
Error in mr(map = map, reduce = reduce, combine = combine, vectorized.reduce, :
hadoop streaming failed with error code 1
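[Editor's note] When the streaming job dies with a broken pipe like this, the underlying R error is usually easier to see by rerunning the same map function on a small in-memory sample with rmr2's local backend, which bypasses Hadoop entirely. A rough sketch, assuming rmr2 is installed (the sample data here is invented):

```r
library(rmr2)
rmr.options(backend = "local")   # run map/reduce in-process, no Hadoop

## Tiny in-memory stand-in for the weibo file: id -> text.
sample.in <- to.dfs(keyval(c(1, 2), c("first post", "second post")))

out <- mapreduce(input = sample.in,
                 map = function(k, v) keyval(v, k))
from.dfs(out)   # parse errors now surface as plain R errors
```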
--
post: rha...@googlegroups.com ||
unsubscribe: rhadoop+u...@googlegroups.com ||
web: https://groups.google.com/d/forum/rhadoop?hl=en-US
---
You received this message because you are subscribed to the Google Groups "RHadoop" group.
To unsubscribe from this group and stop receiving emails from it, send an email to rhadoop+u...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
It's a private function; you should prefix it with rmr2:::
This is not encouraged or supported; it's just for the sake of the experiment. If it works, we'll find a better way to reuse that function. But I think there will be other problems; let me test your data and I'll let you know.
Antonio
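[Editor's note] For reference, `::` only reaches a package's exported objects; a non-exported ("private") function needs `:::` or `getFromNamespace`. Illustrated with a base package rather than rmr2, since the internal function in question isn't named in the thread:

```r
stats::median(c(1, 2, 3))                 # exported object: two colons work
## stats::format.perc would fail --- it is not exported from stats
stats:::format.perc(c(0.25, 0.975), 3)    # ':::' reaches internal objects
getFromNamespace("format.perc", "stats")  # equivalent namespace lookup
```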
1.没有心情的心情是怎样的心情。2.在不同的经度,纬度,邂逅不同的人。3.一个人只有在静下心来的时候,才能看到最真的自己。4.一个人一句话,足以让你回忆一辈子。5.别动不动就哭,世界上没那么多爱你的人。6.跟自己说声对不起,因为曾经为了别人难为了自己。7.青春并不忧伤,却被我们演绎的如此凄凉。 #1元租赁美国百年户外品牌整套露营装#我在不停的转发,就让我奖中奖中奖吧,我好期盼能够得到哦,希望在博主这能好运连连 地址:http://t.cn/zHde8D0 #2013年中大促#强烈支持转发。转发此微博:期待~ 地址:http://t.cn/zHnmcik #20万拥有太湖一套房#小编辛苦啦,我会一直支持你们! 地址:http://t.cn/zH31Te7 3530471382430953 转发微博 3533065332432400 是一种刚强的意志 //@白百何:青春不是生命的一个阶段,它是一种精神状态[赞] 3533208782562449 就要旅游去了,但是怎么高兴不起来呢。 http://t.cn/zjqskLv 3533785037480677 //@柯蓝://@雾满拦江: 3538004087967675 【快来微博搜索 找朋友 吧!】通过关注分组帮你寻觅散落在微博上的身边好友;独有星座筛选功能,帮你在朋友关系圈中找感情和事业最佳配对!http://t.cn/zjC6PAh 3540683807368088 缓慢思考 答案便萌生出来 于是透彻领悟 听到的声响 泪如雨下 在意的 却总是伤害我 3543473439477398 春节版的微博Android客户端好喜庆~我给大家拜个年送个红包#让红包飞# mini汽车、千台HTC手机等你拿!猛戳:http://t.cn/zYLWxxq 新版特性:1.新增精美新年主题;2.全新话题功能;3.支持分享到微信!http://t.cn/zY5SGaL 3543519027697663 [馋嘴]@乐属我佳 3544567473334631 @乐属我佳 如来,[嘻嘻] 3544679737811062 @乐属我佳 3544679839057754 [兔子]@乐属我佳 3545285781756747 转发微博 3545286054459769 //@徐凡1991:转发微博 3545286972995826 转发微博 3545322477016534 我正在使用#微博二维码#,扫描下面的二维码就能关注我啦,快来和我一起玩转微博吧~ 我在:http://t.cn/zYJCbNm 3545323240567558 明天,明天噢! 
貌似很纠结,也很期待。[嘻嘻]不管怎样都很开心的,有情人的陪伴…[偷笑] 我在这里:http://t.cn/zjJM7rt 3545554682606600 //@于正1978:[哈哈][哈哈][哈哈][哈哈]接财神!@宁财神 3591188953463792 3546623147781339 转发微博 3546671864646880 @乐属我佳 3546689577518621 转发微博 3548057734274342 转发微博 3548058556060679 人生就是如此吧,得不到了永远会是好的,应该学会知足。[嘻嘻]@刘佳佳0915 3548060955658148 真纠结!@乐属我佳 3548061404066954 转发微博 3548801442416379 转发微博 3548801471778707 转发微博 3548801518165870 转发微博 3548812335449099 我刚领取了微号731678197,【搜索】或【@】731678197就可以快速找到我。超炫靓号、专属标识、闪亮勋章~ 我就是微博潮人,你也来加入吧( http://t.cn/zOWeMad ) 3548817380688886 某 这句戳到我了 3548820400566585 即使发现自己错了 也要坚持走下去 你给我的 是一种近乎偏执的态度 3548820933535019 有木有花样美男的感觉 就是那种靠脸蛋取胜的男孩纸 哦哈哈 3552205195578206 3548939498646751 原来可以这样… 3577512506827103 3549273289767137 元宵节爬山的哦!嘿嘿… 开心… 3549272576500446 3549273545265088 转发微博 3549506509247128 我刚刚更换了个人主页封面图,欢迎大家围观哦~@刘佳佳0915 http://weibo.com/3211291673/profile 3549507198113305 早上还没吃饭呢,就这样馋我 3549507349353875 3549507528524241 转发微博 3549691901399324 今天我看见它开了,有种快乐在心里! 你就是美,就是美,美、美、美、、、、[偷笑] 我在:http://t.cn/zjxblNy 3549692040670001 转发微博 3549692523110872 转发微博 3549692875201660 转发微博 3549693131146222 好酷
2124 1750 2132 997 2133 1540 2135 3549 2134 3438 2136 1457 2137 1348 2123 3312 2138 63 2139 3219 1287 1548 2140 27 1288 2804 1289 3619
c("�, "f", "�", "�, "�, "�� "��", "�, "�, "�� "��", "�, "�� "��", "�, "��", "�, "�� "��", "�, "�� "��", "�, "��", "�, "�� "��", "�, "�[", "�, "�� "��", "�, "�� "��", "�, "�h", "ttp", "t", "cn", "zh", "3", "sf", "2", "k") 3592367796239058 c("�, "�� "��", "�, "�� "��", "�, "�� "��", "�, "�� "��", "�, "�� "�, "�� "��", "�, "�� "��", "�, "�� "�� "��", "�, "�� "�, "�� "��", "�, "�� "�", "�, "�, "�� "��", "�, "�� "��", "�, "�, "�, "�� "��", "�, "�� "��", "�, "�h", "ttp", "t", "cn", "zh", "380", "tf") 3592379158700163 c("�, "�� "��", "�, "��", "�, "�, "�, "�� "��", "�, "�, "�� "��", "�, "�� "��", "��", "�, "�� "��", "�, "�� "��", "�, "�� "��", "�, "�� "��", "�, "�h", "ttp", "t", "cn", "zh", "33", "qsn") 3592379204606985 c("Կ", "�� "��", "�, "�, "�� "��", "�, "�]", "�", "�", "��", "�, "�� "��", "��, "�, "�� "��", "�, "�� "��", "�, "�, "�� "��", "�, "�� "��", "�, "�� "��", "�, "�� "�", "�, "�, "�, "�� "��", "�, "�h", "ttp", "t", "cn", "zh", "334", "nz") 3592379209041862 c("�, "�� "��", "�, "�� "�� "��", "�, "�� "��", "�, "�]", "�, "�� "��", "�, "�� "��", "�, "�� "��", "�, "�� "��", "�, "�, "�� "�", "�, "�� "�� "��", "�, "��, "�, "�� "��", "�, "�� "�� "�, "ɽ", "�", "�, "��", "�, "�, "�� "��", "�, "�h", "ttp", "t", "cn", "zh", "1", "e", "5", "md") 3592379209041745
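[Editor's note] The shredded characters above look like a classic encoding mismatch: the weibo text is stored in one encoding (likely UTF-8 or GBK) while R or the streaming pipe decodes it as another. A small sketch of declaring and converting encodings explicitly; the actual encoding of the original file is an assumption:

```r
x <- "\u5fae\u535a"                        # "weibo" in Chinese, as a UTF-8 string
y <- iconv(x, from = "UTF-8", to = "GBK")  # re-encode the bytes as GBK
z <- iconv(y, from = "GBK", to = "UTF-8")  # converting back is lossless
identical(z, x)                            # TRUE

## When reading the file, declare the encoding instead of relying on the locale:
## read.table(file, sep = "\t", fileEncoding = "UTF-8", encoding = "UTF-8")
```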
Hi Antonio,
##' @author Jian Li <\email{...@sina.com}>