I'm trying to understand what the posix timestamp format actually does. Looking at the source on GitHub, it seems the parser boils down to:

return new DateTime(input.longValue() * 1000);

But when I try to ingest my data, indexing fails with the following exception:
Caused by: com.metamx.common.parsers.ParseException: Unparseable timestamp found!
at io.druid.data.input.impl.MapInputRowParser.parse(MapInputRowParser.java:72) ~[druid-api-0.9.2.jar:0.9.2]
at io.druid.data.input.impl.StringInputRowParser.parseMap(StringInputRowParser.java:136) ~[druid-api-0.9.2.jar:0.9.2]
at io.druid.data.input.impl.StringInputRowParser.parse(StringInputRowParser.java:131) ~[druid-api-0.9.2.jar:0.9.2]
at io.druid.indexer.HadoopyStringInputRowParser.parse(HadoopyStringInputRowParser.java:48) ~[druid-indexing-hadoop-0.9.2.jar:0.9.2]
at io.druid.indexer.HadoopDruidIndexerMapper.parseInputRow(HadoopDruidIndexerMapper.java:105) ~[druid-indexing-hadoop-0.9.2.jar:0.9.2]
at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:72) ~[druid-indexing-hadoop-0.9.2.jar:0.9.2]
at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:285) ~[druid-indexing-hadoop-0.9.2.jar:0.9.2]
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_73]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_73]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_73]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_73]
at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_73]
Caused by: java.lang.NumberFormatException: For input string: "1487249458.633"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65) ~[?:1.8.0_73]
at java.lang.Long.parseLong(Long.java:589) ~[?:1.8.0_73]
at java.lang.Long.parseLong(Long.java:631) ~[?:1.8.0_73]
at com.metamx.common.parsers.TimestampParser$3.apply(TimestampParser.java:73) ~[java-util-0.27.10.jar:?]
at com.metamx.common.parsers.TimestampParser$3.apply(TimestampParser.java:68) ~[java-util-0.27.10.jar:?]
at com.metamx.common.parsers.TimestampParser$9.apply(TimestampParser.java:159) ~[java-util-0.27.10.jar:?]
at com.metamx.common.parsers.TimestampParser$9.apply(TimestampParser.java:150) ~[java-util-0.27.10.jar:?]
at io.druid.data.input.impl.TimestampSpec.extractTimestamp(TimestampSpec.java:97) ~[druid-api-0.9.2.jar:0.9.2]
at io.druid.data.input.impl.MapInputRowParser.parse(MapInputRowParser.java:60) ~[druid-api-0.9.2.jar:0.9.2]
at io.druid.data.input.impl.StringInputRowParser.parseMap(StringInputRowParser.java:136) ~[druid-api-0.9.2.jar:0.9.2]
at io.druid.data.input.impl.StringInputRowParser.parse(StringInputRowParser.java:131) ~[druid-api-0.9.2.jar:0.9.2]
at io.druid.indexer.HadoopyStringInputRowParser.parse(HadoopyStringInputRowParser.java:48) ~[druid-indexing-hadoop-0.9.2.jar:0.9.2]
at io.druid.indexer.HadoopDruidIndexerMapper.parseInputRow(HadoopDruidIndexerMapper.java:105) ~[druid-indexing-hadoop-0.9.2.jar:0.9.2]
at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:72) ~[druid-indexing-hadoop-0.9.2.jar:0.9.2]
at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:285) ~[druid-indexing-hadoop-0.9.2.jar:0.9.2]
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_73]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_73]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_73]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_73]
at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_73]
Why is posix the wrong choice for a timestamp like 1487249458.633?
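For reference, here is a minimal sketch of what I think is happening, based on the Long.parseLong frame in the stack trace (the class name and the double-based conversion at the end are mine, not Druid's):

```java
public class PosixRepro {
    public static void main(String[] args) {
        String ts = "1487249458.633"; // epoch seconds with a fractional part

        try {
            // posix appears to expect whole epoch seconds, parsed as a long
            long seconds = Long.parseLong(ts);
            System.out.println(seconds * 1000);
        } catch (NumberFormatException e) {
            // "1487249458.633" is not a valid long, so this is the path taken,
            // matching the NumberFormatException in the trace above
            System.out.println("NumberFormatException: " + e.getMessage());
        }

        // Parsing as a double and rounding to millis handles the fraction
        long millis = Math.round(Double.parseDouble(ts) * 1000);
        System.out.println(millis); // 1487249458633
    }
}
```

So the fractional seconds seem to be what trips up Long.parseLong, if I'm reading the trace right.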
Thanks.
-William