12/04/27 17:21:17 INFO streaming.MongoStreamJob: Running
12/04/27 17:21:17 INFO streaming.MongoStreamJob: Init
12/04/27 17:21:17 INFO streaming.MongoStreamJob: Process Args
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Setup Options
12/04/27 17:21:17 INFO streaming.StreamJobPatch: PreProcess Args
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Parse Options
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: '-mapper'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: 'twit_map.py'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: '-reducer'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: 'twit_reduce.py'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: '-inputURI'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: '-outputURI'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: '-file'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: 'twit_map.py'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: '-file'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Arg: 'twit_reduce.py'
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Add InputSpecs
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Setup output_
12/04/27 17:21:17 INFO streaming.StreamJobPatch: Post Process Args
12/04/27 17:21:17 INFO streaming.MongoStreamJob: Args processed.
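The arguments parsed above correspond to an invocation roughly like the following. This is a reconstruction, not taken from the log: the `-inputURI` value is inferred from the split lines further down (`mongodb://127.0.0.1/test.live`), and the `-outputURI` value and jar name are placeholders, since the log never shows them.

```shell
# Hypothetical reconstruction of the job launch; URI values and jar name
# are placeholders where the log does not show them.
hadoop jar mongo-hadoop-streaming.jar \
    -mapper twit_map.py \
    -reducer twit_reduce.py \
    -inputURI mongodb://127.0.0.1/test.live \
    -outputURI mongodb://127.0.0.1/test.output \
    -file twit_map.py \
    -file twit_reduce.py
```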
2012-04-27 17:21:17.614 java[21100:1903] Unable to load realm info from SCDynamicStore
2012-04-27 17:21:17.752 java[21100:1903] Unable to load realm info from SCDynamicStore
12/04/27 17:21:18 INFO io.MongoIdentifierResolver: Resolving: bson
12/04/27 17:21:18 INFO io.MongoIdentifierResolver: Resolving: bson
12/04/27 17:21:18 INFO io.MongoIdentifierResolver: Resolving: bson
12/04/27 17:21:18 INFO io.MongoIdentifierResolver: Resolving: bson
packageJobJar: [twit_map.py, twit_reduce.py] [] /var/folders/wz/vmr658f56dn5ly79j7vzxy2c0000gq/T/streamjob668124738975197009.jar tmpDir=null
12/04/27 17:21:18 INFO streaming.MongoStreamJob: Input Format: com.mongodb.hadoop.mapred.MongoInputFormat@a50a649
12/04/27 17:21:18 INFO streaming.MongoStreamJob: Output Format: com.mongodb.hadoop.mapred.MongoOutputFormat@34d507e9
12/04/27 17:21:18 INFO streaming.MongoStreamJob: Key Class: class com.mongodb.hadoop.io.BSONWritable
12/04/27 17:21:18 WARN conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
12/04/27 17:21:18 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
12/04/27 17:21:18 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
12/04/27 17:21:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/04/27 17:21:18 WARN conf.Configuration: fs.default.name is deprecated. Instead, use fs.defaultFS
12/04/27 17:21:18 WARN conf.Configuration: mapred.used.genericoptionsparser is deprecated. Instead, use mapreduce.client.genericoptionsparser.used
12/04/27 17:21:18 INFO util.MongoSplitter: Calculate Splits Code ... Use Shards? false, Use Chunks? true; Collection Sharded? false
12/04/27 17:21:18 INFO util.MongoSplitter: Creation of Input Splits is enabled.
12/04/27 17:21:18 INFO util.MongoSplitter: Using Unsharded Split mode (Calculating multiple splits though)
12/04/27 17:21:18 INFO util.MongoSplitter: Calculating unsharded input splits on namespace 'test.live' with Split Key '{ "_id" : 1}' and a split size of '8'mb per
12/04/27 17:21:18 INFO util.MongoSplitter: Calculated 5 splits.
12/04/27 17:21:18 INFO input.MongoInputSplit: Creating a new MongoInputSplit for MongoURI 'mongodb://127.0.0.1/test.live', query: '{ "$query" : { } , "$max" : { "_id" : { "$oid" : "4f981eb458407f60e02ec0ab"}}}', fieldSpec: '{ }', sort: '{ }', limit: 0, skip: 0 .
12/04/27 17:21:18 INFO input.MongoInputSplit: Creating a new MongoInputSplit for MongoURI 'mongodb://127.0.0.1/test.live', query: '{ "$query" : { } , "$min" : { "_id" : { "$oid" : "4f981eb458407f60e02ec0ab"}} , "$max" : { "_id" : { "$oid" : "4f9916fa6ada108fa2771c7f"}}}', fieldSpec: '{ }', sort: '{ }', limit: 0, skip: 0 .
12/04/27 17:21:18 INFO input.MongoInputSplit: Creating a new MongoInputSplit for MongoURI 'mongodb://127.0.0.1/test.live', query: '{ "$query" : { } , "$min" : { "_id" : { "$oid" : "4f9916fa6ada108fa2771c7f"}} , "$max" : { "_id" : { "$oid" : "4f9917486ada108fa27723e2"}}}', fieldSpec: '{ }', sort: '{ }', limit: 0, skip: 0 .
12/04/27 17:21:18 INFO input.MongoInputSplit: Creating a new MongoInputSplit for MongoURI 'mongodb://127.0.0.1/test.live', query: '{ "$query" : { } , "$min" : { "_id" : { "$oid" : "4f9917486ada108fa27723e2"}} , "$max" : { "_id" : { "$oid" : "4f9917b16ada108fa2772b45"}}}', fieldSpec: '{ }', sort: '{ }', limit: 0, skip: 0 .
12/04/27 17:21:18 INFO input.MongoInputSplit: Creating a new MongoInputSplit for MongoURI 'mongodb://127.0.0.1/test.live', query: '{ "$query" : { } , "$min" : { "_id" : { "$oid" : "4f9917b16ada108fa2772b45"}} , "$max" : { "_id" : { "$oid" : "4f9918596ada108fa27732a8"}}}', fieldSpec: '{ }', sort: '{ }', limit: 0, skip: 0 .
12/04/27 17:21:18 INFO input.MongoInputSplit: Creating a new MongoInputSplit for MongoURI 'mongodb://127.0.0.1/test.live', query: '{ "$query" : { } , "$min" : { "_id" : { "$oid" : "4f9918596ada108fa27732a8"}}}', fieldSpec: '{ }', sort: '{ }', limit: 0, skip: 0 .
12/04/27 17:21:18 INFO mapreduce.JobSubmitter: number of splits:6
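The six MongoInputSplit lines above follow a simple pattern: five boundary `_id` values carve the collection into six windows, where the first window has only `$max`, the last has only `$min`, and each interior window has both. A minimal sketch of that windowing logic (a hypothetical helper, not the actual MongoSplitter code):

```python
def make_split_queries(boundaries):
    """Build per-split range queries over _id from sorted boundary values.

    n boundary values yield n + 1 windows: the first is open below
    ($max only), the last is open above ($min only), and interior
    windows carry both, mirroring the MongoInputSplit log lines.
    """
    splits = []
    for i in range(len(boundaries) + 1):
        query = {"$query": {}}
        if i > 0:
            query["$min"] = {"_id": boundaries[i - 1]}
        if i < len(boundaries):
            query["$max"] = {"_id": boundaries[i]}
        splits.append(query)
    return splits
```

Note the log itself reports "Calculated 5 splits" but then creates six MongoInputSplits, and the JobSubmitter counts 6: the splitter appears to count boundary points, while the windows between and around them number one more.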
12/04/27 17:21:19 WARN mapred.LocalDistributedCacheManager: LocalJobRunner does not support symlinking into current working dir.
12/04/27 17:21:19 INFO mapred.LocalJobRunner: OutputCommitter set in config null
12/04/27 17:21:19 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapred.FileOutputCommitter
12/04/27 17:21:19 INFO mapreduce.Job: Running job: job_local_0001
12/04/27 17:21:19 INFO mapred.LocalJobRunner: Waiting for map tasks
12/04/27 17:21:19 INFO mapred.LocalJobRunner: Starting task: attempt_local_0001_m_000000_0
12/04/27 17:21:19 INFO mapred.Task: Using ResourceCalculatorPlugin : null
12/04/27 17:21:19 INFO mapred.MapTask: numReduceTasks: 1
12/04/27 17:21:19 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
12/04/27 17:21:19 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
12/04/27 17:21:19 INFO mapred.MapTask: soft limit at 83886080
12/04/27 17:21:19 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
12/04/27 17:21:19 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
12/04/27 17:21:19 INFO streaming.PipeMapRed: PipeMapRed exec [/Users/tkid/Projects/mongo-hadoop/target/./twit_map.py]
12/04/27 17:21:19 WARN conf.Configuration: fs.default.name is deprecated. Instead, use fs.defaultFS
12/04/27 17:21:20 INFO streaming.PipeMapRed: R/W/S=1/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:20 INFO streaming.PipeMapRed: R/W/S=10/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:20 INFO mapreduce.Job: Job job_local_0001 running in uber mode : false
12/04/27 17:21:20 INFO mapreduce.Job: map 0% reduce 0%
12/04/27 17:21:20 INFO streaming.PipeMapRed: R/W/S=100/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:21 INFO streaming.PipeMapRed: Records R/W=518/1
12/04/27 17:21:22 INFO streaming.PipeMapRed: R/W/S=1000/728/0 in:500=1000/2 [rec/s] out:364=728/2 [rec/s]
12/04/27 17:21:23 INFO input.MongoRecordReader: Cursor exhausted.
Done Mapping.
12/04/27 17:21:23 INFO streaming.PipeMapRed: MRErrorThread done
12/04/27 17:21:23 INFO io.BSONWritable: No Length Header available.java.io.EOFException
12/04/27 17:21:23 INFO streaming.PipeMapRed: mapRedFinished
12/04/27 17:21:23 INFO mapred.LocalJobRunner:
12/04/27 17:21:23 INFO mapred.MapTask: Starting flush of map output
12/04/27 17:21:23 INFO mapred.MapTask: Spilling map output
12/04/27 17:21:23 INFO mapred.MapTask: bufstart = 0; bufend = 109200; bufvoid = 104857600
12/04/27 17:21:23 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26206840(104827360); length = 7557/6553600
12/04/27 17:21:24 INFO mapred.MapTask: Finished spill 0
12/04/27 17:21:24 INFO mapred.Task: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
12/04/27 17:21:24 INFO mapred.LocalJobRunner: Records R/W=518/1
12/04/27 17:21:24 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0' done.
12/04/27 17:21:24 INFO mapred.LocalJobRunner: Finishing task: attempt_local_0001_m_000000_0
12/04/27 17:21:24 INFO mapred.LocalJobRunner: Starting task: attempt_local_0001_m_000001_0
12/04/27 17:21:24 INFO mapred.Task: Using ResourceCalculatorPlugin : null
12/04/27 17:21:24 INFO mapred.MapTask: numReduceTasks: 1
12/04/27 17:21:24 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
12/04/27 17:21:24 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
12/04/27 17:21:24 INFO mapred.MapTask: soft limit at 83886080
12/04/27 17:21:24 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
12/04/27 17:21:24 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
12/04/27 17:21:24 INFO streaming.PipeMapRed: PipeMapRed exec [/Users/tkid/Projects/mongo-hadoop/target/./twit_map.py]
12/04/27 17:21:24 INFO streaming.PipeMapRed: R/W/S=1/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:24 INFO streaming.PipeMapRed: R/W/S=10/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:25 INFO streaming.PipeMapRed: R/W/S=100/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:25 INFO mapreduce.Job: map 100% reduce 0%
12/04/27 17:21:26 INFO streaming.PipeMapRed: Records R/W=516/1
12/04/27 17:21:26 INFO streaming.PipeMapRed: R/W/S=1000/801/0 in:1000=1000/1 [rec/s] out:801=801/1 [rec/s]
12/04/27 17:21:27 INFO input.MongoRecordReader: Cursor exhausted.
Done Mapping.
12/04/27 17:21:27 INFO streaming.PipeMapRed: MRErrorThread done
12/04/27 17:21:27 INFO io.BSONWritable: No Length Header available.java.io.EOFException
12/04/27 17:21:27 INFO streaming.PipeMapRed: mapRedFinished
12/04/27 17:21:27 INFO mapred.LocalJobRunner:
12/04/27 17:21:27 INFO mapred.MapTask: Starting flush of map output
12/04/27 17:21:27 INFO mapred.MapTask: Spilling map output
12/04/27 17:21:27 INFO mapred.MapTask: bufstart = 0; bufend = 105925; bufvoid = 104857600
12/04/27 17:21:27 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26206836(104827344); length = 7561/6553600
12/04/27 17:21:27 INFO mapred.MapTask: Finished spill 0
12/04/27 17:21:27 INFO mapred.Task: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting
12/04/27 17:21:27 INFO mapred.LocalJobRunner: Records R/W=516/1
12/04/27 17:21:27 INFO mapred.Task: Task 'attempt_local_0001_m_000001_0' done.
12/04/27 17:21:27 INFO mapred.LocalJobRunner: Finishing task: attempt_local_0001_m_000001_0
12/04/27 17:21:27 INFO mapred.LocalJobRunner: Starting task: attempt_local_0001_m_000002_0
12/04/27 17:21:27 INFO mapred.Task: Using ResourceCalculatorPlugin : null
12/04/27 17:21:27 INFO mapred.MapTask: numReduceTasks: 1
12/04/27 17:21:28 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
12/04/27 17:21:28 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
12/04/27 17:21:28 INFO mapred.MapTask: soft limit at 83886080
12/04/27 17:21:28 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
12/04/27 17:21:28 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
12/04/27 17:21:28 INFO streaming.PipeMapRed: PipeMapRed exec [/Users/tkid/Projects/mongo-hadoop/target/./twit_map.py]
12/04/27 17:21:28 INFO streaming.PipeMapRed: R/W/S=1/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:28 INFO streaming.PipeMapRed: R/W/S=10/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:28 INFO streaming.PipeMapRed: R/W/S=100/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:29 INFO streaming.PipeMapRed: Records R/W=557/1
12/04/27 17:21:29 INFO streaming.PipeMapRed: R/W/S=1000/505/0 in:1000=1000/1 [rec/s] out:505=505/1 [rec/s]
12/04/27 17:21:30 INFO input.MongoRecordReader: Cursor exhausted.
Done Mapping.
12/04/27 17:21:30 INFO streaming.PipeMapRed: MRErrorThread done
12/04/27 17:21:30 INFO io.BSONWritable: No Length Header available.java.io.EOFException
12/04/27 17:21:30 INFO streaming.PipeMapRed: mapRedFinished
12/04/27 17:21:30 INFO mapred.LocalJobRunner:
12/04/27 17:21:30 INFO mapred.MapTask: Starting flush of map output
12/04/27 17:21:30 INFO mapred.MapTask: Spilling map output
12/04/27 17:21:30 INFO mapred.MapTask: bufstart = 0; bufend = 103041; bufvoid = 104857600
12/04/27 17:21:30 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26206836(104827344); length = 7561/6553600
12/04/27 17:21:30 INFO mapred.MapTask: Finished spill 0
12/04/27 17:21:30 INFO mapred.Task: Task:attempt_local_0001_m_000002_0 is done. And is in the process of commiting
12/04/27 17:21:30 INFO mapred.LocalJobRunner: Records R/W=557/1
12/04/27 17:21:30 INFO mapred.Task: Task 'attempt_local_0001_m_000002_0' done.
12/04/27 17:21:30 INFO mapred.LocalJobRunner: Finishing task: attempt_local_0001_m_000002_0
12/04/27 17:21:30 INFO mapred.LocalJobRunner: Starting task: attempt_local_0001_m_000003_0
12/04/27 17:21:30 INFO mapred.Task: Using ResourceCalculatorPlugin : null
12/04/27 17:21:30 INFO mapred.MapTask: numReduceTasks: 1
12/04/27 17:21:31 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
12/04/27 17:21:31 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
12/04/27 17:21:31 INFO mapred.MapTask: soft limit at 83886080
12/04/27 17:21:31 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
12/04/27 17:21:31 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
12/04/27 17:21:31 INFO streaming.PipeMapRed: PipeMapRed exec [/Users/tkid/Projects/mongo-hadoop/target/./twit_map.py]
12/04/27 17:21:31 INFO streaming.PipeMapRed: R/W/S=1/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:31 INFO streaming.PipeMapRed: R/W/S=10/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:31 INFO streaming.PipeMapRed: R/W/S=100/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:32 INFO streaming.PipeMapRed: Records R/W=527/1
12/04/27 17:21:32 INFO streaming.PipeMapRed: R/W/S=1000/499/0 in:1000=1000/1 [rec/s] out:499=499/1 [rec/s]
12/04/27 17:21:33 INFO input.MongoRecordReader: Cursor exhausted.
Done Mapping.
12/04/27 17:21:33 INFO io.BSONWritable: No Length Header available.java.io.EOFException
12/04/27 17:21:33 INFO streaming.PipeMapRed: MRErrorThread done
12/04/27 17:21:33 INFO streaming.PipeMapRed: mapRedFinished
12/04/27 17:21:33 INFO mapred.LocalJobRunner:
12/04/27 17:21:33 INFO mapred.MapTask: Starting flush of map output
12/04/27 17:21:33 INFO mapred.MapTask: Spilling map output
12/04/27 17:21:33 INFO mapred.MapTask: bufstart = 0; bufend = 102427; bufvoid = 104857600
12/04/27 17:21:33 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26206836(104827344); length = 7561/6553600
12/04/27 17:21:33 INFO mapred.MapTask: Finished spill 0
12/04/27 17:21:33 INFO mapred.Task: Task:attempt_local_0001_m_000003_0 is done. And is in the process of commiting
12/04/27 17:21:33 INFO mapred.LocalJobRunner: Records R/W=527/1
12/04/27 17:21:33 INFO mapred.Task: Task 'attempt_local_0001_m_000003_0' done.
12/04/27 17:21:33 INFO mapred.LocalJobRunner: Finishing task: attempt_local_0001_m_000003_0
12/04/27 17:21:33 INFO mapred.LocalJobRunner: Starting task: attempt_local_0001_m_000004_0
12/04/27 17:21:33 INFO mapred.Task: Using ResourceCalculatorPlugin : null
12/04/27 17:21:33 INFO mapred.MapTask: numReduceTasks: 1
12/04/27 17:21:33 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
12/04/27 17:21:33 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
12/04/27 17:21:33 INFO mapred.MapTask: soft limit at 83886080
12/04/27 17:21:33 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
12/04/27 17:21:33 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
12/04/27 17:21:33 INFO streaming.PipeMapRed: PipeMapRed exec [/Users/tkid/Projects/mongo-hadoop/target/./twit_map.py]
12/04/27 17:21:34 INFO streaming.PipeMapRed: R/W/S=1/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:34 INFO streaming.PipeMapRed: R/W/S=10/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:34 INFO streaming.PipeMapRed: R/W/S=100/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:34 INFO streaming.PipeMapRed: Records R/W=559/1
12/04/27 17:21:35 INFO streaming.PipeMapRed: R/W/S=1000/515/0 in:1000=1000/1 [rec/s] out:515=515/1 [rec/s]
12/04/27 17:21:36 INFO input.MongoRecordReader: Cursor exhausted.
Done Mapping.
12/04/27 17:21:36 INFO streaming.PipeMapRed: MRErrorThread done
12/04/27 17:21:36 INFO io.BSONWritable: No Length Header available.java.io.EOFException
12/04/27 17:21:36 INFO streaming.PipeMapRed: mapRedFinished
12/04/27 17:21:36 INFO mapred.LocalJobRunner:
12/04/27 17:21:36 INFO mapred.MapTask: Starting flush of map output
12/04/27 17:21:36 INFO mapred.MapTask: Spilling map output
12/04/27 17:21:36 INFO mapred.MapTask: bufstart = 0; bufend = 99915; bufvoid = 104857600
12/04/27 17:21:36 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26206836(104827344); length = 7561/6553600
12/04/27 17:21:36 INFO mapreduce.Job: map 66% reduce 0%
12/04/27 17:21:36 INFO mapred.MapTask: Finished spill 0
12/04/27 17:21:36 INFO mapred.Task: Task:attempt_local_0001_m_000004_0 is done. And is in the process of commiting
12/04/27 17:21:36 INFO mapred.LocalJobRunner: Records R/W=559/1
12/04/27 17:21:36 INFO mapred.Task: Task 'attempt_local_0001_m_000004_0' done.
12/04/27 17:21:36 INFO mapred.LocalJobRunner: Finishing task: attempt_local_0001_m_000004_0
12/04/27 17:21:36 INFO mapred.LocalJobRunner: Starting task: attempt_local_0001_m_000005_0
12/04/27 17:21:36 INFO mapred.Task: Using ResourceCalculatorPlugin : null
12/04/27 17:21:36 INFO mapred.MapTask: numReduceTasks: 1
12/04/27 17:21:36 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
12/04/27 17:21:36 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
12/04/27 17:21:36 INFO mapred.MapTask: soft limit at 83886080
12/04/27 17:21:36 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
12/04/27 17:21:36 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
12/04/27 17:21:36 INFO streaming.PipeMapRed: PipeMapRed exec [/Users/tkid/Projects/mongo-hadoop/target/./twit_map.py]
12/04/27 17:21:36 INFO streaming.PipeMapRed: R/W/S=1/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:36 INFO streaming.PipeMapRed: R/W/S=10/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:37 INFO streaming.PipeMapRed: R/W/S=100/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:37 INFO mapreduce.Job: map 100% reduce 0%
12/04/27 17:21:37 INFO streaming.PipeMapRed: Records R/W=552/1
12/04/27 17:21:38 INFO streaming.PipeMapRed: R/W/S=1000/504/0 in:1000=1000/1 [rec/s] out:504=504/1 [rec/s]
12/04/27 17:21:38 INFO input.MongoRecordReader: Cursor exhausted.
Done Mapping.
12/04/27 17:21:38 INFO io.BSONWritable: No Length Header available.java.io.EOFException
12/04/27 17:21:38 INFO streaming.PipeMapRed: MRErrorThread done
12/04/27 17:21:38 INFO streaming.PipeMapRed: mapRedFinished
12/04/27 17:21:38 INFO mapred.LocalJobRunner:
12/04/27 17:21:38 INFO mapred.MapTask: Starting flush of map output
12/04/27 17:21:38 INFO mapred.MapTask: Spilling map output
12/04/27 17:21:38 INFO mapred.MapTask: bufstart = 0; bufend = 90089; bufvoid = 104857600
12/04/27 17:21:38 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26207708(104830832); length = 6689/6553600
12/04/27 17:21:39 INFO mapred.MapTask: Finished spill 0
12/04/27 17:21:39 INFO mapred.Task: Task:attempt_local_0001_m_000005_0 is done. And is in the process of commiting
12/04/27 17:21:39 INFO mapred.LocalJobRunner: Records R/W=552/1
12/04/27 17:21:39 INFO mapred.Task: Task 'attempt_local_0001_m_000005_0' done.
12/04/27 17:21:39 INFO mapred.LocalJobRunner: Finishing task: attempt_local_0001_m_000005_0
12/04/27 17:21:39 INFO mapred.LocalJobRunner: Map task executor complete.
12/04/27 17:21:39 INFO mapred.Task: Using ResourceCalculatorPlugin : null
12/04/27 17:21:39 INFO mapred.Merger: Merging 6 sorted segments
12/04/27 17:21:39 INFO mapred.Merger: Down to the last merge-pass, with 6 segments left of total size: 632791 bytes
12/04/27 17:21:39 INFO mapred.LocalJobRunner:
12/04/27 17:21:39 INFO streaming.PipeMapRed: PipeMapRed exec [/Users/tkid/Projects/mongo-hadoop/target/./twit_reduce.py]
12/04/27 17:21:39 INFO streaming.PipeMapRed: R/W/S=1/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:39 INFO streaming.PipeMapRed: R/W/S=10/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:39 INFO streaming.PipeMapRed: R/W/S=100/0/0 in:NA [rec/s] out:NA [rec/s]
12/04/27 17:21:39 INFO streaming.PipeMapRed: R/W/S=1000/0/0 in:NA [rec/s] out:NA [rec/s]
Processing Timezon None
Processing Timezon Abu Dhabi
Processing Timezon Adelaide
Processing Timezon Alaska
Processing Timezon Almaty
Processing Timezon Amsterdam
Processing Timezon Arizona
Processing Timezon Astana
Processing Timezon Athens
Processing Timezon Atlantic Time (Canada)
Processing Timezon Auckland
Processing Timezon Azores
Processing Timezon Baghdad
Processing Timezon Bangkok
Processing Timezon Beijing
Processing Timezon Belgrade
Processing Timezon Berlin
Processing Timezon Bern
Processing Timezon Bogota
Processing Timezon Brasilia
Processing Timezon Brisbane
Processing Timezon Brussels
Processing Timezon Bucharest
Processing Timezon Budapest
Processing Timezon Buenos Aires
Processing Timezon Cairo
Processing Timezon Canberra
Processing Timezon Cape Verde Is.
Processing Timezon Caracas
Processing Timezon Casablanca
Processing Timezon Central America
Processing Timezon Central Time (US & Canada)
Processing Timezon Chennai
Processing Timezon Chihuahua
Processing Timezon Copenhagen
Processing Timezon Dhaka
Processing Timezon Dublin
Processing Timezon Eastern Time (US & Canada)
Processing Timezon Edinburgh
Processing Timezon Ekaterinburg
Processing Timezon Fiji
Processing Timezon Georgetown
Processing Timezon Greenland
Processing Timezon Guadalajara
Processing Timezon Guam
Processing Timezon Hanoi
Processing Timezon Harare
Processing Timezon Hawaii
Processing Timezon Helsinki
Processing Timezon Hong Kong
Processing Timezon Indiana (East)
Processing Timezon International Date Line West
Processing Timezon Irkutsk
Processing Timezon Islamabad
Processing Timezon Istanbul
Processing Timezon Jakarta
Processing Timezon Jerusalem
Processing Timezon Kabul
Processing Timezon Karachi
Processing Timezon Kathmandu
Processing Timezon Kuala Lumpur
Processing Timezon Kuwait
Processing Timezon Kyiv
Processing Timezon La Paz
Processing Timezon Lima
Processing Timezon Lisbon
Processing Timezon Ljubljana
Processing Timezon London
Processing Timezon Madrid
Processing Timezon Mazatlan
Processing Timezon Melbourne
Processing Timezon Mexico City
Processing Timezon Mid-Atlantic
Processing Timezon Minsk
Processing Timezon Monterrey
Processing Timezon Moscow
Processing Timezon Mountain Time (US & Canada)
Processing Timezon Mumbai
Processing Timezon Muscat
Processing Timezon Nairobi
Processing Timezon New Caledonia
Processing Timezon New Delhi
Processing Timezon Novosibirsk
Processing Timezon Nuku'alofa
Processing Timezon Osaka
Processing Timezon Pacific Time (US & Canada)
12/04/27 17:21:40 INFO streaming.PipeMapRed: R/W/S=10000/0/0 in:10000=10000/1 [rec/s] out:0=0/1 [rec/s]
Processing Timezon Paris
Processing Timezon Perth
Processing Timezon Prague
Processing Timezon Pretoria
Processing Timezon Quito
Processing Timezon Riga
Processing Timezon Riyadh
Processing Timezon Rome
Processing Timezon Santiago
Processing Timezon Sapporo
Processing Timezon Sarajevo
Processing Timezon Seoul
Processing Timezon Singapore
Processing Timezon St. Petersburg
Processing Timezon Stockholm
Processing Timezon Sydney
Processing Timezon Taipei
Processing Timezon Tallinn
Processing Timezon Tashkent
Processing Timezon Tehran
Processing Timezon Tijuana
Processing Timezon Tokyo
Processing Timezon Ulaan Bataar
Processing Timezon Vienna
Processing Timezon Warsaw
Processing Timezon Wellington
Processing Timezon West Central Africa
Processing Timezon Yakutsk
Processing Timezon Zagreb
12/04/27 17:21:40 INFO streaming.PipeMapRed: MRErrorThread done
12/04/27 17:21:40 INFO streaming.PipeMapRed: Records R/W=11127/1
12/04/27 17:21:40 INFO io.BSONWritable: No Length Header available.java.io.EOFException
12/04/27 17:21:40 INFO streaming.PipeMapRed: mapRedFinished
12/04/27 17:21:40 INFO mapred.Task: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
12/04/27 17:21:40 INFO mapred.LocalJobRunner: Records R/W=11127/1 > reduce
12/04/27 17:21:40 INFO mapred.Task: Task 'attempt_local_0001_r_000000_0' done.
12/04/27 17:21:40 WARN mapred.LocalJobRunner: job_local_0001
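The `twit_map.py` and `twit_reduce.py` scripts themselves do not appear in the log, but their behavior can be inferred: the mapper emits one record per tweet and the reducer aggregates per time zone, printing "Processing Timezon &lt;name&gt;" to stderr as it goes. A hypothetical stand-in for that logic is sketched below. It is only illustrative: the real job streams BSON through mongo-hadoop's streaming support, whereas this sketch uses plain in-memory key/value pairs so the count-by-timezone shape is easy to follow.

```python
import sys
from itertools import groupby

def mapper(records):
    # Emit (time_zone, 1) per record; each record is assumed to be a dict
    # with a "time_zone" field (an assumption -- the real schema is not
    # shown in the log).
    for rec in records:
        yield rec.get("time_zone") or "None", 1

def reducer(pairs):
    # Sum counts per time zone. Pairs must arrive sorted by key, as
    # Hadoop's shuffle guarantees between the map and reduce phases.
    for tz, group in groupby(pairs, key=lambda kv: kv[0]):
        # Mirrors the (misspelled) stderr lines captured in the log above.
        print("Processing Timezon", tz, file=sys.stderr)
        yield tz, sum(count for _, count in group)
```

In a real streaming job each half would read from stdin and write tab-separated lines to stdout; here the two generators are wired together directly for clarity.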