java.net.URISyntaxException: Relative path in absolute URI

Nikita Salnikov-Tarnovski

Aug 31, 2016, 9:06:27 AM
to Druid User
I am trying to reindex my existing data using a Hadoop indexing task. My ioConfig is as follows:

"ioConfig": {
"type": "hadoop",
"inputSpec": {
"type": "dataSource",
"ingestionSpec": {
"dataSource": "Transactions",
"intervals": [
"2016-07-01T00:00:00Z/2016-09-01T00:00:00Z"
]
}
}
}
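
For context: with inputSpec type "dataSource", the Hadoop job reads the existing Transactions segments straight out of deep storage, which is why segment storage paths show up in the logs below. The ioConfig above sits inside the usual task envelope, roughly like this (the ellipses are placeholders for my existing schema and the block above; not valid JSON as written):

{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": { ... existing Transactions schema ... },
    "ioConfig": { ... the block above ... },
    "tuningConfig": { "type": "hadoop" }
  }
}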

The task runs for some time and then fails with:

2016-08-31T12:41:19,268 INFO [LocalJobRunner Map Task Executor #0] io.druid.indexer.hadoop.DatasourceRecordReader - Getting storage path for segment [Transactions_2016-08-25T18:00:00.000Z_2016-08-25T19:00:00.000Z_2016-08-25T18:01:02.792Z_3]
2016-08-31T12:41:19,268 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Starting flush of map output
2016-08-31T12:41:19,270 INFO [Thread-53] org.apache.hadoop.mapred.LocalJobRunner - map task executor complete.
2016-08-31T12:41:19,271 WARN [Thread-53] org.apache.hadoop.mapred.LocalJobRunner - job_local1331578400_0001
java.lang.Exception: com.metamx.common.ISE: Unable to form simple file uri
	at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522) [hadoop-mapreduce-client-common-2.3.0.jar:?]
Caused by: com.metamx.common.ISE: Unable to form simple file uri
	at io.druid.indexer.JobHelper.getURIFromSegment(JobHelper.java:711) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.hadoop.DatasourceRecordReader$1.apply(DatasourceRecordReader.java:82) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.hadoop.DatasourceRecordReader$1.apply(DatasourceRecordReader.java:76) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at com.google.common.collect.Lists$TransformingRandomAccessList$1.transform(Lists.java:582) ~[guava-16.0.1.jar:?]
	at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48) ~[guava-16.0.1.jar:?]
	at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48) ~[guava-16.0.1.jar:?]
	at com.metamx.common.guava.BaseSequence.makeYielder(BaseSequence.java:104) ~[java-util-0.27.9.jar:?]
	at com.metamx.common.guava.BaseSequence.toYielder(BaseSequence.java:81) ~[java-util-0.27.9.jar:?]
	at com.metamx.common.guava.ConcatSequence.toYielder(ConcatSequence.java:58) ~[java-util-0.27.9.jar:?]
	at io.druid.segment.realtime.firehose.IngestSegmentFirehose.<init>(IngestSegmentFirehose.java:171) ~[druid-server-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.hadoop.DatasourceRecordReader.initialize(DatasourceRecordReader.java:114) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at org.apache.hadoop.mapreduce.lib.input.DelegatingRecordReader.initialize(DelegatingRecordReader.java:84) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:525) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_91]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_91]
	at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_91]
Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:var/druid/segments/Transactions/2016-08-29T19:00:00.000Z_2016-08-29T20:00:00.000Z/2016-08-29T19:01:07.549Z/1/index.zip
	at java.net.URI.checkPath(URI.java:1823) ~[?:1.8.0_91]
	at java.net.URI.<init>(URI.java:745) ~[?:1.8.0_91]
	at io.druid.indexer.JobHelper.getURIFromSegment(JobHelper.java:708) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.hadoop.DatasourceRecordReader$1.apply(DatasourceRecordReader.java:82) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.hadoop.DatasourceRecordReader$1.apply(DatasourceRecordReader.java:76) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at com.google.common.collect.Lists$TransformingRandomAccessList$1.transform(Lists.java:582) ~[guava-16.0.1.jar:?]
	at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48) ~[guava-16.0.1.jar:?]
	at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48) ~[guava-16.0.1.jar:?]
	at com.metamx.common.guava.BaseSequence.makeYielder(BaseSequence.java:104) ~[java-util-0.27.9.jar:?]
	at com.metamx.common.guava.BaseSequence.toYielder(BaseSequence.java:81) ~[java-util-0.27.9.jar:?]
	at com.metamx.common.guava.ConcatSequence.toYielder(ConcatSequence.java:58) ~[java-util-0.27.9.jar:?]
	at io.druid.segment.realtime.firehose.IngestSegmentFirehose.<init>(IngestSegmentFirehose.java:171) ~[druid-server-0.9.1.1.jar:0.9.1.1]
	at io.druid.indexer.hadoop.DatasourceRecordReader.initialize(DatasourceRecordReader.java:114) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
	at org.apache.hadoop.mapreduce.lib.input.DelegatingRecordReader.initialize(DelegatingRecordReader.java:84) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:525) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) ~[hadoop-mapreduce-client-core-2.3.0.jar:?]
	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) ~[hadoop-mapreduce-client-common-2.3.0.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_91]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_91]
	at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_91]
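
For what it is worth, the failing URI in the trace is file:var/druid/segments/... with no leading slash on the path, i.e. the segment's loadSpec path is relative. java.net.URI's multi-argument constructors reject exactly that combination of a scheme plus a relative path. A standalone reproduction (my guess at what JobHelper.getURIFromSegment ends up doing with a "local" loadSpec):

import java.net.URI;
import java.net.URISyntaxException;

public class UriRepro
{
  public static void main(String[] args) throws URISyntaxException
  {
    // A scheme plus a relative path is rejected by URI's multi-argument
    // constructors -- this produces the exact message in the trace above.
    try {
      new URI("file", null, "var/druid/segments/index.zip", null, null);
    }
    catch (URISyntaxException e) {
      System.out.println(e.getMessage());
      // prints: Relative path in absolute URI: file:var/druid/segments/index.zip
    }

    // The same path with a leading slash is accepted.
    URI ok = new URI("file", null, "/var/druid/segments/index.zip", null, null);
    System.out.println(ok);
    // prints: file:/var/druid/segments/index.zip
  }
}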

Before this there are a lot of records like:

2016-08-31T12:41:13,449 INFO [LocalJobRunner Map Task Executor #0] io.druid.indexer.hadoop.DatasourceRecordReader - Getting storage path for segment [Transactions_2016-08-25T15:00:00.000Z_2016-08-25T16:00:00.000Z_2016-08-25T15:01:07.778Z]

without any errors.
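
My guess is that this cluster was configured with local deep storage pointing at a relative directory, so segments were published with loadSpec paths like var/druid/segments/... instead of /var/druid/segments/.... If that is right, making the directory absolute in common.runtime.properties should at least prevent newly published segments from hitting this (segments already in the metadata store would presumably keep their relative paths and need their loadSpec fixed by hand):

druid.storage.type=local
druid.storage.storageDirectory=/var/druid/segments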

Is there any way to fix this?

Fede Casanova

Sep 14, 2016, 10:56:16 AM
to Druid User
Same problem here. Did you find a solution?

Nikita Salnikov-Tarnovski

Sep 15, 2016, 5:58:52 AM
to Druid User
Nope, I switched to another node, which uses S3 as deep storage.

Fangjin Yang

Oct 7, 2016, 4:42:25 PM
to Druid User
Did you manage to get this working? I wonder if it is a problem with the ingest segment firehose.

Nikita Salnikov-Tarnovski

May 23, 2017, 6:46:14 AM
to Druid User
No, it still does not work in 0.9.2. I just needed to reindex data stored in local segments and hit the same error.