Never mind my previous post; I had a misconfiguration in my XML files. I would really appreciate any help, as I am really stuck.
[hdoopuser@localhost hadoop]$ /opt/hadoop/hadoop/bin/hadoop jar /opt/mongo-hadoop/examples/treasury_yield/target/treasury-example_1.1.2-1.1.0.jar com.mongodb.hadoop.examples.treasury.TreasuryYieldXMLConfig
13/06/18 14:09:27 INFO util.MongoTool: Created a conf: 'Configuration: core-default.xml, core-site.xml, mongo-defaults.xml, mongo-treasury_yield.xml' on {class com.mongodb.hadoop.examples.treasury.TreasuryYieldXMLConfig} as job named '<unnamed MongoTool job>'
13/06/18 14:09:27 INFO util.MongoTool: Mapper Class: class com.mongodb.hadoop.examples.treasury.TreasuryYieldMapper
13/06/18 14:09:27 INFO util.MongoTool: Setting up and running MapReduce job in foreground, will wait for results. {Verbose? true}
13/06/18 14:09:28 INFO util.MongoSplitter: MongoSplitter calculating splits
13/06/18 14:09:28 INFO util.MongoSplitter: use shards: false
13/06/18 14:09:28 INFO util.MongoSplitter: use chunks: true
13/06/18 14:09:28 INFO util.MongoSplitter: collection sharded: false
13/06/18 14:09:28 INFO util.MongoSplitter: use range queries: false
13/06/18 14:09:28 INFO util.MongoSplitter: Creation of Input Splits is enabled.
13/06/18 14:09:28 INFO util.MongoSplitter: Using Unsharded Split mode (Calculating multiple splits though)
13/06/18 14:09:28 INFO util.MongoSplitter: Calculating unsharded input splits on namespace 'mongo_hadoop.yield_historical.in' with Split Key '{ "_id" : 1}' and a split size of '8'mb per
13/06/18 14:09:28 WARN util.MongoSplitter: WARNING: No Input Splits were calculated by the split code. Proceeding with a *single* split. Data may be too small, try lowering 'mongo.input.split_size' if this is undesirable.
13/06/18 14:09:28 INFO util.MongoSplitter: MongoSplitter found 1 splits.
13/06/18 14:09:29 INFO mapred.JobClient: Running job: job_201306181409_0001
13/06/18 14:09:30 INFO mapred.JobClient: map 0% reduce 0%
13/06/18 14:09:38 INFO mapred.JobClient: Task Id : attempt_201306181409_0001_m_000000_0, Status : FAILED
java.io.IOException: wrong value class: class com.mongodb.hadoop.io.BSONWritable is not class org.apache.hadoop.io.DoubleWritable
at org.apache.hadoop.mapred.IFile$Writer.append(IFile.java:167)
at org.apache.hadoop.mapred.Task$CombineOutputCollector.collect(Task.java:1168)
at org.apache.hadoop.mapred.Task$NewCombinerRunner$OutputConverter.write(Task.java:1492)
at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
at com.mongodb.hadoop.examples.treasury.TreasuryYieldReducer.reduce(TreasuryYieldReducer.java:62)
at com.mongodb.hadoop.examples.treasury.TreasuryYieldReducer.reduce(TreasuryYieldReducer.java:39)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
at org.apache.hadoop.mapred.Task$NewCombinerRunner.combine(Task.java:1513)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1436)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1298)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:699)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
13/06/18 14:09:44 INFO mapred.JobClient: Task Id : attempt_201306181409_0001_m_000000_1, Status : FAILED
java.io.IOException: wrong value class: class com.mongodb.hadoop.io.BSONWritable is not class org.apache.hadoop.io.DoubleWritable
13/06/18 14:09:50 INFO mapred.JobClient: Task Id : attempt_201306181409_0001_m_000000_2, Status : FAILED
java.io.IOException: wrong value class: class com.mongodb.hadoop.io.BSONWritable is not class org.apache.hadoop.io.DoubleWritable
13/06/18 14:09:59 INFO mapred.JobClient: Job complete: job_201306181409_0001
13/06/18 14:09:59 INFO mapred.JobClient: Counters: 7
13/06/18 14:09:59 INFO mapred.JobClient: Job Counters
13/06/18 14:09:59 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=27722
13/06/18 14:09:59 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
13/06/18 14:09:59 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
13/06/18 14:09:59 INFO mapred.JobClient: Rack-local map tasks=4
13/06/18 14:09:59 INFO mapred.JobClient: Launched map tasks=4
13/06/18 14:09:59 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
13/06/18 14:09:59 INFO mapred.JobClient: Failed map tasks=1
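
From the stack trace it looks like the failure happens in the combine step: the map output value class is DoubleWritable, but the combiner (which appears to be TreasuryYieldReducer) writes BSONWritable, and Hadoop requires a combiner's output key/value types to match the map output types exactly. Below is a minimal sketch of what I think a type-compatible combiner would have to look like; the class name TreasuryYieldCombiner is just something I made up, and I am assuming the mapper really does emit IntWritable/DoubleWritable pairs as the trace suggests.

import java.io.IOException;

import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.mapreduce.Reducer;

// Hypothetical combiner: consumes and produces the mapper's output types
// (IntWritable, DoubleWritable) so the spill/sort code in MapTask accepts
// the records. It just re-emits every value, so it is safe but does no
// real combining.
public class TreasuryYieldCombiner
        extends Reducer<IntWritable, DoubleWritable, IntWritable, DoubleWritable> {

    @Override
    protected void reduce(IntWritable key, Iterable<DoubleWritable> values, Context context)
            throws IOException, InterruptedException {
        for (DoubleWritable value : values) {
            context.write(key, value);
        }
    }
}

Alternatively, I guess I could just not run a combiner at all; if it is being picked up from mongo-treasury_yield.xml (I believe the property is mongo.job.combiner, but I am not certain), removing that property should also avoid the type mismatch. Does that sound right, or am I missing something else in my configuration?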