Not sure how many users here use Hadoop, but I am writing to this group because the test case I have is the HelloWorld of MapReduce, word count, and I believe it's more of a Scala/Java interop issue. I have narrowed it down to my map task implementation. I say this because when I reference and use a seemingly equivalent Java implementation, everything works fine.

Java version:

public class TokenCounterMapperJava extends Mapper<LongWritable, Text, Text, LongWritable> {
    private final LongWritable one = new LongWritable(1);
    private Text word = new Text();

    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] splits = value.toString().split("\\s+");
        for (String split : splits) {
            word.set(split);
            context.write(word, one);
        }
    }
}

Scala version:

class TokenCounterMapper extends Mapper[LongWritable, Text, Text, LongWritable] {
  val one = new LongWritable(1)
  val word = new Text()

  @Override
  def map(key: LongWritable, value: Text, context: Context) {
    val splits = value.toString().split("\\s+")
    for (split <- splits) {
      word.set(split)
      context.write(word, one)
    }
  }
}
Here is a link to the full page of code in Scala. I get the following error. FYI, I get this error when I run it locally, not even on a cluster.

############
java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved org.apache.hadoop.io.LongWritable
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:871)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:574)
at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
at org.apache.hadoop.mapreduce.Mapper.map(Mapper.java:124)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:210)
###########
I am using Hadoop 0.20.2, specifically CDH3u4.
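
One thing I noticed in the trace: the frame org.apache.hadoop.mapreduce.Mapper.map(Mapper.java:124) is the base class's map, which is just the identity function, so it looks like my Scala map method may not actually be overriding it at all (the identity map would pass the LongWritable key straight through, which would explain the type mismatch). If the inner Context type is what trips up the override, my guess is that the signature needs to spell it out as a type projection and use Scala's override modifier, roughly like the untested sketch below. The class and field names are just mine from above; I haven't verified that this is the right fix.

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.Mapper

class TokenCounterMapper extends Mapper[LongWritable, Text, Text, LongWritable] {
  val one = new LongWritable(1)
  val word = new Text()

  // Refer to the Java inner class Mapper.Context explicitly via a type
  // projection, and use Scala's override keyword so the compiler checks
  // that this really overrides Mapper.map rather than adding an overload.
  override def map(key: LongWritable, value: Text,
                   context: Mapper[LongWritable, Text, Text, LongWritable]#Context): Unit = {
    for (split <- value.toString.split("\\s+")) {
      word.set(split)
      context.write(word, one)
    }
  }
}

Does that sound plausible, or should I be looking somewhere else (job configuration, output key/value classes, etc.)?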
thanks
Arun