I am hitting the following error while trying to launch an append task: the merge fails with java.io.IOException: No such file or directory thrown from File.createTempFile. How can I resolve this? Full task log below.
2015-03-30T01:24:22,050 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[merge_test_hostprocesses_461dcc1ac018127e6c9ac25d4a6e3e15da0a1941_2015-03-30T01:24:18.007Z]: LockListAction{}
2015-03-30T01:24:22,051 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[merge_test_hostprocesses_461dcc1ac018127e6c9ac25d4a6e3e15da0a1941_2015-03-30T01:24:18.007Z] to overlord[http://pascal-16.tetrationanalytics.com:8080/druid/indexer/v1/action]: LockListAction{}
2015-03-30T01:24:22,059 INFO [task-runner-0] io.druid.indexing.common.task.MergeTaskBase - Starting merge of id[merge_test_hostprocesses_461dcc1ac018127e6c9ac25d4a6e3e15da0a1941_2015-03-30T01:24:18.007Z], segments: [test_hostprocesses_2015-03-28T23:22:00.000-07:00_2015-03-28T23:23:00.000-07:00_2015-03-30T01:00:41.042Z, test_hostprocesses_2015-03-28T23:23:00.000-07:00_2015-03-28T23:24:00.000-07:00_2015-03-30T01:01:49.405Z]
WARN [org.apache.hadoop.util.NativeCodeLoader] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
WARN [org.apache.hadoop.hdfs.BlockReaderLocal] The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2015-03-30T01:24:23,296 INFO [task-runner-0] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-03-30T01:24:23,302 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.computation.buffer.size, ${base_path}.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2015-03-30T01:24:23,302 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2015-03-30T01:24:23,302 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2015-03-30T01:24:23,302 INFO [task-runner-0] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2015-03-30T01:24:23,308 INFO [task-runner-0] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.segment.data.BitmapSerdeFactory] from props[druid.processing.bitmap.] as [io.druid.segment.data.BitmapSerde$DefaultBitmapSerdeFactory@98f7327]
2015-03-30T01:24:23,350 INFO [task-runner-0] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2015-03-30T01:24:23,358 INFO [task-runner-0] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.segment.data.BitmapSerdeFactory] from props[druid.processing.bitmap.] as [io.druid.segment.data.BitmapSerde$DefaultBitmapSerdeFactory@1fa40706]
2015-03-30T01:24:23,365 INFO [task-runner-0] io.druid.segment.IndexMerger - outDir[/tmp/persistent/task/merge_test_hostprocesses_461dcc1ac018127e6c9ac25d4a6e3e15da0a1941_2015-03-30T01:24:18.007Z/work/merged/v8-tmp] completed index.drd in 2 millis.
2015-03-30T01:24:23,368 ERROR [task-runner-0] io.druid.indexing.common.task.MergeTaskBase - Exception merging[test_hostprocesses]: {class=io.druid.indexing.common.task.MergeTaskBase, exceptionType=class java.io.IOException, exceptionMessage=No such file or directory, interval=2015-03-29T06:22:00.000Z/2015-03-29T06:24:00.000Z}
java.io.IOException: No such file or directory
    at java.io.UnixFileSystem.createFileExclusively(Native Method) ~[?:1.8.0_25]
    at java.io.File.createTempFile(File.java:2024) ~[?:1.8.0_25]
    at java.io.File.createTempFile(File.java:2070) ~[?:1.8.0_25]
    at io.druid.segment.data.TmpFileIOPeon.makeOutputStream(TmpFileIOPeon.java:42) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.segment.data.GenericIndexedWriter.open(GenericIndexedWriter.java:63) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.segment.IndexMerger.makeIndexFiles(IndexMerger.java:496) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.segment.IndexMerger.append(IndexMerger.java:399) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.segment.IndexMerger.append(IndexMerger.java:326) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.indexing.common.task.AppendTask.merge(AppendTask.java:106) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.indexing.common.task.MergeTaskBase.run(MergeTaskBase.java:146) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:235) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:214) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_25]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_25]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_25]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_25]
2015-03-30T01:24:23,372 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Removing task directory: /tmp/persistent/task/merge_test_hostprocesses_461dcc1ac018127e6c9ac25d4a6e3e15da0a1941_2015-03-30T01:24:18.007Z/work
2015-03-30T01:24:23,377 INFO [task-runner-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
"id" : "merge_test_hostprocesses_461dcc1ac018127e6c9ac25d4a6e3e15da0a1941_2015-03-30T01:24:18.007Z",
"status" : "FAILED",
"duration" : 1324
}
Thanks,
Anubhav