Unable to load the data following the documentation example


tao...@yahoo-inc.com

Mar 25, 2014, 11:46:10 AM
to druid-de...@googlegroups.com
I followed the documentation at http://druid.io/docs/0.6.73/Tutorial%3A-Loading-Your-Data-Part-1.html.

I used this command to load the data:

curl -X 'POST' -H 'Content-Type:application/json' -d @examples/indexing/wikipedia_index_task.json localhost:8087/druid/indexer/v1/task

It threw this exception:
2014-03-25 15:09:24,800 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Will try again in [PT60S].
2014-03-25 15:10:24,801 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_wikipedia_2014-03-25T15:09:19.063Z] to overlord[http://localhost:8087/druid/indexer/v1/action]: SegmentInsertAction{segments=[DataSegment{size=4462, shardSpec=NoneShardSpec, metrics=[count, added, deleted, delta], dimensions=[anonymous, city, continent, country, language, namespace, newpage, page, region, robot, unpatrolled, user], version='2014-03-25T15:09:19.071Z', loadSpec={type=local, path=/tmp/druid/localStorage/wikipedia/2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z/2014-03-25T15:09:19.071Z/0/index.zip}, interval=2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z, dataSource='wikipedia', binaryVersion='9'}]}
2014-03-25 15:10:24,815 WARN [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Exception submitting action for task[index_wikipedia_2014-03-25T15:09:19.063Z]
com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: java.io.StringReader@19b02ae1; line: 1, column: 2]
	at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1369)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:532)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:453)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._handleUnexpectedValue(ReaderBasedJsonParser.java:1386)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:669)
	at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:2926)
	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:2873)
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2041)
	at io.druid.indexing.common.actions.RemoteTaskActionClient.submit(RemoteTaskActionClient.java:102)
	at io.druid.indexing.common.TaskToolbox.pushSegments(TaskToolbox.java:204)
	at io.druid.indexing.common.task.IndexTask.run(IndexTask.java:168)
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:224)
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:203)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:724)

What could be causing this exception?


Fangjin Yang

Mar 25, 2014, 1:13:12 PM
to druid-de...@googlegroups.com
Hi,

I tried the example with 0.6.73 and it worked fine as is. Are you by chance running the indexing service in remote mode (it looks that way)? It appears that some exception occurred while running the task (you can take a closer look at the task logs), likely because a configuration problem caused an error to be returned that can't be deserialized. BTW, if you are running in remote mode, make sure to set druid.selectors.indexing.serviceName so the middle manager can find the overlord.
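
For reference, a minimal sketch of the two settings that have to line up if you do run in remote mode. It assumes the overlord is started with druid.service=overlord; adjust to whatever name your overlord actually announces itself under:

# overlord runtime.properties (assumed service name)
druid.service=overlord

# middle manager runtime.properties -- must match the overlord's druid.service
druid.selectors.indexing.serviceName=overlord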

-- FJ

tao...@yahoo-inc.com

Mar 25, 2014, 8:04:39 PM
to druid-de...@googlegroups.com
Hi Fangjin,

I used 4 nodes to build the Druid cluster; the historical, coordinator, broker, and real-time nodes are deployed across these 4 nodes.
The indexing service was started on the same node as the real-time node.
I used the default configuration for each node and haven't made any modifications.
I hadn't started a middle manager, so I started one on one of the 4 nodes.
Here is the middle manager configuration:
druid.host=localhost
druid.port=8091
druid.service=middleManager

druid.zk.service.host=localhost

druid.db.connector.connectURI=jdbc:mysql://localhost:3306/druid
druid.db.connector.user=druid
druid.db.connector.password=diurd
druid.selectors.indexing.serviceName=overlord
druid.indexer.runner.startPort=8092
druid.indexer.fork.property.druid.computation.buffer.size=268435456

But I still get the same exception.

Thanks,
Tao

Fangjin Yang

Mar 25, 2014, 8:05:57 PM
to druid-de...@googlegroups.com
Hi Tao,

When you say the indexing service is running on the same node as the real-time node, are you running an overlord or a middle manager?



tao...@yahoo-inc.com

Mar 25, 2014, 8:26:18 PM
to druid-de...@googlegroups.com
Hi, Fangjin:

I run an overlord on the same node as the real-time node; they share the same physical machine.

Thanks,
Tao

Fangjin Yang

Mar 25, 2014, 8:44:08 PM
to druid-de...@googlegroups.com
Hi Tao,

Can you post more of the stack trace from the overlord as you submit the task? Also, do you have the logs of the task itself? You can get them from overlord_ip:port/console.html. The logs tell me the overlord thinks it is in remote mode. Also, you've shared the configuration for the middle manager, which should not be needed unless you are running in remote mode (overlord + middle managers, versus just an overlord, which we refer to as local mode).
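
If it is easier than the console, the overlord also serves task status and logs over HTTP. A rough sketch, assuming the /status and /log endpoints are available in this 0.6.x build and using the task id from your earlier run:

# task id from the failed run (yours will differ)
TASK_ID=index_wikipedia_2014-03-25T15:09:19.063Z

# check the task status (assumed endpoint)
curl http://localhost:8087/druid/indexer/v1/task/$TASK_ID/status

# fetch the full task log (assumed endpoint)
curl http://localhost:8087/druid/indexer/v1/task/$TASK_ID/log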

Thanks,
FJ

tao...@yahoo-inc.com

Mar 25, 2014, 10:36:46 PM
to druid-de...@googlegroups.com
Hi, Fangjin:
Here is the log from the UI:
2014-03-25 15:09:19,681 INFO [main] io.druid.server.initialization.PropertiesModule - Loading properties from runtime.properties
2014-03-25 15:09:19,718 INFO [main] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.0.1.Final
2014-03-25 15:09:20,313 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, coordinates=[], localRepository='/home/taoluo/.m2/repository', remoteRepositories=[http://repo1.maven.org/maven2/, https://metamx.artifactoryonline.com/metamx/pub-libs-releases-local]}]
2014-03-25 15:09:21,245 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class com.metamx.emitter.core.LoggingEmitterConfig] from props[druid.emitter.logging.] as [LoggingEmitterConfig{loggerClass='com.metamx.emitter.core.LoggingEmitter', logLevel='info'}]
2014-03-25 15:09:21,329 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.DruidMonitorSchedulerConfig] from props[druid.monitoring.] as [io.druid.server.metrics.DruidMonitorSchedulerConfig@3d83370f]
2014-03-25 15:09:21,354 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.MonitorsConfig] from props[druid.monitoring.] as [MonitorsConfig{monitors=[]}]
2014-03-25 15:09:21,383 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.DruidNode] from props[druid.] as [DruidNode{serviceName='overlord', host='localhost:8088', port=8088}]
2014-03-25 15:09:21,462 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.config.TaskConfig] from props[druid.indexer.task.] as [io.druid.indexing.common.config.TaskConfig@36dc1a15]
2014-03-25 15:09:21,469 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.HttpClientModule$DruidHttpClientConfig] from props[druid.global.http.] as [io.druid.guice.HttpClientModule$DruidHttpClientConfig@782fc709]
2014-03-25 15:09:21,538 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.indexing.IndexingServiceSelectorConfig] from props[druid.selectors.indexing.] as [io.druid.client.indexing.IndexingServiceSelectorConfig@1c529aa0]
2014-03-25 15:09:21,541 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [localhost] for [druid.zk.service.host] on [io.druid.curator.CuratorConfig#getZkHosts()]
2014-03-25 15:09:21,543 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [30000] for [druid.zk.service.sessionTimeoutMs] on [io.druid.curator.CuratorConfig#getZkSessionTimeoutMs()]
2014-03-25 15:09:21,543 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [false] for [druid.curator.compress] on [io.druid.curator.CuratorConfig#enableCompression()]
2014-03-25 15:09:21,630 WARN [main] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
2014-03-25 15:09:21,684 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.CuratorDiscoveryConfig] from props[druid.discovery.curator.] as [io.druid.server.initialization.CuratorDiscoveryConfig@6c8ff873]
2014-03-25 15:09:21,894 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.RetryPolicyConfig] from props[druid.peon.taskActionClient.retry.] as [io.druid.indexing.common.RetryPolicyConfig@52c40b60]
2014-03-25 15:09:21,899 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.segment.loading.LocalDataSegmentPusherConfig] from props[druid.storage.] as [io.druid.segment.loading.LocalDataSegmentPusherConfig@14c54049]
2014-03-25 15:09:21,919 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.DruidServerConfig] from props[druid.server.] as [io.druid.client.DruidServerConfig@63859f83]
2014-03-25 15:09:21,920 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.base] on [io.druid.server.initialization.ZkPathsConfig#getZkBasePath()]
2014-03-25 15:09:21,921 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.loadQueuePath] on [io.druid.server.initialization.ZkPathsConfig#getLoadQueuePath()]
2014-03-25 15:09:21,921 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.coordinatorPath] on [io.druid.server.initialization.ZkPathsConfig#getCoordinatorPath()]
2014-03-25 15:09:21,921 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.statusPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerStatusPath()]
2014-03-25 15:09:21,921 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.leaderLatchPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerLeaderLatchPath()]
2014-03-25 15:09:21,921 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.announcementsPath] on [io.druid.server.initialization.ZkPathsConfig#getAnnouncementsPath()]
2014-03-25 15:09:21,921 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.servedSegmentsPath] on [io.druid.server.initialization.ZkPathsConfig#getServedSegmentsPath()]
2014-03-25 15:09:21,922 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.connectorPath] on [io.druid.server.initialization.ZkPathsConfig#getConnectorPath()]
2014-03-25 15:09:21,922 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.announcementsPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerAnnouncementPath()]
2014-03-25 15:09:21,922 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.tasksPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerTaskPath()]
2014-03-25 15:09:21,922 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.propertiesPath] on [io.druid.server.initialization.ZkPathsConfig#getPropertiesPath()]
2014-03-25 15:09:21,923 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.liveSegmentsPath] on [io.druid.server.initialization.ZkPathsConfig#getLiveSegmentsPath()]
2014-03-25 15:09:21,978 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig@fda8f12]
2014-03-25 15:09:21,982 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.coordination.DataSegmentAnnouncerProvider] from props[druid.announcer.] as [io.druid.server.coordination.LegacyDataSegmentAnnouncerProvider@5ae5d5ca]
2014-03-25 15:09:21,988 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.client.ServerInventoryViewProvider] from props[druid.announcer.] as [io.druid.client.SingleServerInventoryProvider@1454a16a]
2014-03-25 15:09:21,996 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.QueryConfig] from props[druid.query.] as [io.druid.query.QueryConfig@4f0c9799]
2014-03-25 15:09:22,003 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.search.search.SearchQueryConfig] from props[druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig@73e625]
2014-03-25 15:09:22,010 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig@48551176]
2014-03-25 15:09:22,011 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [268435456] for [druid.computation.buffer.size] on [io.druid.server.DruidProcessingConfig#intermediateComputeSizeBytes()]
2014-03-25 15:09:22,011 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [1] for [druid.processing.numThreads] on [com.metamx.common.concurrent.ExecutorServiceConfig#getNumThreads()]
2014-03-25 15:09:22,011 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2014-03-25 15:09:22,021 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig@7510cbc3]
2014-03-25 15:09:22,031 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.log.RequestLoggerProvider] from props[druid.request.logging.] as [io.druid.server.log.NoopRequestLoggerProvider@60b39ad9]
2014-03-25 15:09:22,038 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ServerConfig] from props[druid.server.http.] as [io.druid.server.initialization.ServerConfig@70fab733]
2014-03-25 15:09:22,108 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.core.LoggingEmitter.start()] on object[com.metamx.emitter.core.LoggingEmitter@51d22b42].
2014-03-25 15:09:22,109 INFO [main] com.metamx.emitter.core.LoggingEmitter - Start: started [true]
2014-03-25 15:09:22,109 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.service.ServiceEmitter.start()] on object[com.metamx.emitter.service.ServiceEmitter@7a380c5a].
2014-03-25 15:09:22,109 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.metrics.MonitorScheduler.start()] on object[com.metamx.metrics.MonitorScheduler@75b32765].
2014-03-25 15:09:22,112 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.http.client.HttpClient.start()] on object[com.metamx.http.client.HttpClient@71ad7c17].
2014-03-25 15:09:22,112 INFO [main] io.druid.curator.CuratorModule - Starting Curator
2014-03-25 15:09:22,112 INFO [main] org.apache.curator.framework.imps.CuratorFrameworkImpl - Starting
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:host.name=wordheard.corp.sg3.yahoo.com
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.7.0_25
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.home=/home/y/libexec64/jdk1.7.0/jre
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=lib/aether-connector-okhttp-0.0.9.jar:lib/jetty-continuation-8.1.11.v20130520.jar:lib/zookeeper-3.4.5.jar:lib/classmate-0.8.0.jar:lib/airline-0.6.jar:lib/curator-client-2.4.0.jar:lib/jetty-servlet-8.1.11.v20130520.jar:lib/http-client-0.8.5.jar:lib/jersey-core-1.17.1.jar:lib/commons-io-2.0.1.jar:lib/sigar-1.6.5.132.jar:lib/jackson-dataformat-smile-2.2.3.jar:lib/aether-connector-file-0.9.0.M2.jar:lib/jersey-servlet-1.17.1.jar:lib/extendedset-1.3.4.jar:lib/aether-spi-0.9.0.M2.jar:lib/jersey-guice-1.17.1.jar:lib/server-metrics-0.0.9.jar:lib/curator-framework-2.4.0.jar:lib/tesla-aether-0.0.5.jar:lib/protobuf-java-2.4.0a.jar:lib/wagon-provider-api-2.4.jar:lib/druid-services-0.6.73.jar:lib/java-xmlbuilder-0.4.jar:lib/jsr305-2.0.1.jar:lib/slf4j-api-1.6.4.jar:lib/aws-java-sdk-1.6.0.1.jar:lib/jboss-logging-3.1.1.GA.jar:lib/java-util-0.25.3.jar:lib/joda-time-2.1.jar:lib/maven-model-3.1.1.jar:lib/druid-server-0.6.73.jar:lib/commons-cli-1.2.jar:lib/emitter-0.2.9.jar:lib/jackson-annotations-2.2.3.jar:lib/druid-processing-0.6.73.jar:lib/jackson-core-asl-1.9.13.jar:lib/xpp3-1.1.4c.jar:lib/irc-api-1.0-0011.jar:lib/guice-servlet-4.0-beta.jar:lib/commons-codec-1.7.jar:lib/netty-3.2.4.Final.jar:lib/maven-aether-provider-3.1.1.jar:lib/aether-api-0.9.0.M2.jar:lib/druid-common-0.6.73.jar:lib/google-http-client-jackson2-1.15.0-rc.jar:lib/maven-model-builder-3.1.1.jar:lib/commons-lang-2.6.jar:lib/jackson-jaxrs-json-provider-2.2.3.jar:lib/jets3t-0.9.0.jar:lib/guice-4.0-beta.jar:lib/jetty-servlets-8.1.11.v20130520.jar:lib/aether-impl-0.9.0.M2.jar:lib/jetty-http-8.1.11.v20130520.jar:lib/jackson-core-2.2.3.jar:lib/maven-settings-3.1.1.jar:lib/antlr4-runtime-4.0.jar:lib/druid-api-0.1.11.jar:lib/geoip2-0.4.0.jar:lib/httpcore-4.2.jar:lib/maven-settings-builder-3.1.1.jar:lib/maven-repository-metadata-3.1.1.jar:lib/commons-dbcp-1.4.jar:lib/maxminddb-0.2.0.jar:lib/google-http-client-1.15.0-rc.jar:lib/aopalliance-1.0.jar:lib/jline-0.9.94.jar:lib/validation-api-1.1.0.Final.jar:lib/opencsv-2.3.jar:lib/jetty-security-8.1.11.v20130520.jar:lib/javax.inject-1.jar:lib/curator-recipes-2.4.0.jar:lib/okhttp-1.0.2.jar:lib/javax.servlet-3.0.0.v201112011016.jar:lib/jdbi-2.32.jar:lib/aether-util-0.9.0.M2.jar:lib/bytebuffer-collections-0.0.2.jar:lib/jackson-datatype-guava-2.2.3.jar:lib/jackson-mapper-asl-1.9.13.jar:lib/org.abego.treelayout.core-1.0.1.jar:lib/commons-pool-1.6.jar:lib/jetty-io-8.1.11.v20130520.jar:lib/httpclient-4.2.jar:lib/guava-14.0.1.jar:lib/slf4j-log4j12-1.6.2.jar:lib/rhino-1.7R4.jar:lib/jetty-server-8.1.11.v20130520.jar:lib/jetty-client-8.1.11.v20130520.jar:lib/curator-x-discovery-2.4.0.jar:lib/guice-multibindings-4.0-beta.jar:lib/hibernate-validator-5.0.1.Final.jar:lib/spymemcached-2.8.4.jar:lib/jackson-jaxrs-base-2.2.3.jar:lib/plexus-utils-3.0.15.jar:lib/plexus-interpolation-1.19.jar:lib/asm-3.1.jar:lib/jersey-server-1.17.1.jar:lib/jetty-util-8.1.11.v20130520.jar:lib/config-magic-0.9.jar:lib/icu4j-4.8.1.jar:lib/commons-logging-1.1.1.jar:lib/jackson-databind-2.2.3.jar:lib/druid-indexing-hadoop-0.6.73.jar:lib/compress-lzf-0.8.4.jar:lib/lz4-1.1.2.jar:lib/mysql-connector-java-5.1.18.jar:lib/druid-indexing-service-0.6.73.jar:lib/jackson-module-jaxb-annotations-2.2.3.jar:lib/log4j-1.2.16.jar:lib/jackson-datatype-joda-2.2.3.jar:/home/taoluo/software/hadoop-2.2.0/*:/home/taoluo/software/hadoop-2.2.0/lib/*:/home/taoluo/software/hadoop-2.2.0/etc/hadoop/:config/overlord
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/home/y/lib64
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=/tmp
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
2014-03-25 15:09:22,121 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
2014-03-25 15:09:22,122 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.version=2.6.32-358.6.2.el6.YAHOO.20130516.x86_64
2014-03-25 15:09:22,122 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.name=taoluo
2014-03-25 15:09:22,122 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.home=/home/taoluo
2014-03-25 15:09:22,122 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/home/taoluo/software/druid-services-0.6.73
2014-03-25 15:09:22,122 INFO [main] org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=localhost sessionTimeout=30000 watcher=org.apache.curator.ConnectionState@421d7ce3
2014-03-25 15:09:22,140 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.discovery.ServerDiscoverySelector.start() throws java.lang.Exception] on object[io.druid.curator.discovery.ServerDiscoverySelector@2d9d4f83].
2014-03-25 15:09:22,141 INFO [main-SendThread(localhost.localdomain:2181)] org.apache.zookeeper.ClientCnxn - Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error)
2014-03-25 15:09:22,146 INFO [main-SendThread(localhost.localdomain:2181)] org.apache.zookeeper.ClientCnxn - Socket connection established to localhost.localdomain/0:0:0:0:0:0:0:1:2181, initiating session
2014-03-25 15:09:22,164 INFO [main-SendThread(localhost.localdomain:2181)] org.apache.zookeeper.ClientCnxn - Session establishment complete on server localhost.localdomain/0:0:0:0:0:0:0:1:2181, sessionid = 0x144edf549800014, negotiated timeout = 30000
2014-03-25 15:09:22,170 INFO [main-EventThread] org.apache.curator.framework.state.ConnectionStateManager - State change: CONNECTED
2014-03-25 15:09:23,292 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.announcement.Announcer.start()] on object[io.druid.curator.announcement.Announcer@c840dcf].
2014-03-25 15:09:23,294 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.client.ServerInventoryView.start() throws java.lang.Exception] on object[io.druid.client.SingleServerInventoryView@1b17a0c2].
2014-03-25 15:09:23,299 INFO [main] org.eclipse.jetty.server.Server - jetty-8.1.11.v20130520
2014-03-25 15:09:23,363 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/localhost:8083
2014-03-25 15:09:23,364 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for localhost:8083, inventoryPath /druid/servedSegments/localhost:8083
2014-03-25 15:09:23,364 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='localhost:8083', host='localhost:8083', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-03-25 15:09:23,364 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/localhost:8081
2014-03-25 15:09:23,365 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for localhost:8081, inventoryPath /druid/servedSegments/localhost:8081
2014-03-25 15:09:23,365 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='localhost:8081', host='localhost:8081', maxSize=10000000000, tier='_default_tier', type='historical', priority='0'}]
2014-03-25 15:09:23,388 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[localhost:8083] added segment[wikipedia_2014-03-24T16:00:00.000Z_2014-03-24T17:00:00.000Z_2014-03-24T16:00:00.000Z]
2014-03-25 15:09:23,389 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[localhost:8083] added segment[wikipedia_2014-03-23T09:00:00.000Z_2014-03-23T10:00:00.000Z_2014-03-23T09:00:00.000Z]
2014-03-25 15:09:23,390 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[localhost:8083] added segment[wikipedia_2014-03-23T14:00:00.000Z_2014-03-23T15:00:00.000Z_2014-03-23T14:00:00.000Z]
2014-03-25 15:09:23,391 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[localhost:8083] added segment[wikipedia_2014-03-25T07:00:00.000Z_2014-03-25T08:00:00.000Z_2014-03-25T07:00:00.000Z]
Mar 25, 2014 3:09:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class
Mar 25, 2014 3:09:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering io.druid.server.StatusResource as a root resource class
Mar 25, 2014 3:09:23 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.17.1 02/28/2013 12:47 PM'
Mar 25, 2014 3:09:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
Mar 25, 2014 3:09:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.QueryResource to GuiceInstantiatedComponentProvider
Mar 25, 2014 3:09:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.indexing.worker.executor.ChatHandlerResource to GuiceInstantiatedComponentProvider
Mar 25, 2014 3:09:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2014-03-25 15:09:24,034 INFO [main] org.eclipse.jetty.server.AbstractConnector - Started SelectChann...@0.0.0.0:8088
2014-03-25 15:09:24,035 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.start()] on object[io.druid.server.coordination.SingleDataSegmentAnnouncer@3881e03c].
2014-03-25 15:09:24,035 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Announcing self[DruidServerMetadata{name='localhost:8088', host='localhost:8088', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}] at [/druid/announcements/localhost:8088]
2014-03-25 15:09:24,075 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.start()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@79917214].
2014-03-25 15:09:24,075 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Announcing self[DruidServerMetadata{name='localhost:8088', host='localhost:8088', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}] at [/druid/announcements/localhost:8088]
2014-03-25 15:09:24,077 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.start()] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@127a923c].
2014-03-25 15:09:24,081 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/localhost:8088
2014-03-25 15:09:24,081 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for localhost:8088, inventoryPath /druid/servedSegments/localhost:8088
2014-03-25 15:09:24,081 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='localhost:8088', host='localhost:8088', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}]
2014-03-25 15:09:24,211 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Running with task: {
  "type" : "index",
  "id" : "index_wikipedia_2014-03-25T15:09:19.063Z",
  "dataSource" : "wikipedia",
  "granularitySpec" : {
    "type" : "uniform",
    "gran" : "DAY",
    "intervals" : [ "2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z" ]
  },
  "spatialDimensions" : [ ],
  "aggregators" : [ {
    "type" : "count",
    "name" : "count"
  }, {
    "type" : "doubleSum",
    "name" : "added",
    "fieldName" : "added"
  }, {
    "type" : "doubleSum",
    "name" : "deleted",
    "fieldName" : "deleted"
  }, {
    "type" : "doubleSum",
    "name" : "delta",
    "fieldName" : "delta"
  } ],
  "indexGranularity" : {
    "type" : "none"
  },
  "targetPartitionSize" : 0,
  "firehose" : {
    "type" : "local",
    "baseDir" : "/home/taoluo/software/druid-services-0.6.73/examples/indexing",
    "filter" : "wikipedia_data.json",
    "parser" : {
      "type" : "string",
      "timestampSpec" : {
        "column" : "timestamp",
        "format" : "auto"
      },
      "data" : {
        "format" : "json",
        "dimensions" : [ "page", "language", "user", "unpatrolled", "newPage", "robot", "anonymous", "namespace", "continent", "country", "region", "city" ],
        "spatialDimensions" : [ ]
      },
      "dimensionExclusions" : [ "timestamp" ]
    }
  },
  "rowFlushBoundary" : 0,
  "groupId" : "index_wikipedia_2014-03-25T15:09:19.063Z",
  "interval" : "2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z",
  "resource" : {
    "availabilityGroup" : "index_wikipedia_2014-03-25T15:09:19.063Z",
    "requiredCapacity" : 1
  }
}
2014-03-25 15:09:24,214 INFO [main] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_wikipedia_2014-03-25T15:09:19.063Z]: LockTryAcquireAction{interval=2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z}
2014-03-25 15:09:24,222 INFO [main] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_wikipedia_2014-03-25T15:09:19.063Z] to overlord[http://localhost:8087/druid/indexer/v1/action]: LockTryAcquireAction{interval=2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z}
2014-03-25 15:09:24,238 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://localhost:8087
2014-03-25 15:09:24,270 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://localhost:8087
2014-03-25 15:09:24,271 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://localhost:8087
2014-03-25 15:09:24,272 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://localhost:8087
2014-03-25 15:09:24,282 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://localhost:8087
2014-03-25 15:09:24,382 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Running task: index_wikipedia_2014-03-25T15:09:19.063Z
2014-03-25 15:09:24,383 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_wikipedia_2014-03-25T15:09:19.063Z]: LockListAction{}
2014-03-25 15:09:24,387 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_wikipedia_2014-03-25T15:09:19.063Z] to overlord[http://localhost:8087/druid/indexer/v1/action]: LockListAction{}
2014-03-25 15:09:24,468 INFO [task-runner-0] io.druid.indexing.common.index.YeOldePlumberSchool - Spilling index[0] with rows[5] to: /tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0
2014-03-25 15:09:24,474 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting persist for interval[2013-08-31T00:00:00.000Z/2013-08-31T12:41:27.001Z], rows[5]
2014-03-25 15:09:24,492 INFO [task-runner-0] io.druid.segment.IndexMerger - outDir[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0/v8-tmp] completed index.drd in 6 millis.
2014-03-25 15:09:24,530 INFO [task-runner-0] io.druid.segment.IndexMerger - outDir[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0/v8-tmp] completed dim conversions in 33 millis.
2014-03-25 15:09:24,550 INFO [task-runner-0] io.druid.segment.CompressedPools - Allocating new chunkEncoder[1]
2014-03-25 15:09:24,563 INFO [task-runner-0] io.druid.segment.IndexMerger - outDir[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0/v8-tmp] completed walk through of 5 rows in 33 millis.
2014-03-25 15:09:24,565 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[anonymous] with cardinality[1]
2014-03-25 15:09:24,574 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[anonymous] in 11 millis.
2014-03-25 15:09:24,575 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[city] with cardinality[5]
2014-03-25 15:09:24,578 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[city] in 3 millis.
2014-03-25 15:09:24,578 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[continent] with cardinality[3]
2014-03-25 15:09:24,580 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[continent] in 2 millis.
2014-03-25 15:09:24,580 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[country] with cardinality[5]
2014-03-25 15:09:24,581 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[country] in 1 millis.
2014-03-25 15:09:24,582 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[language] with cardinality[4]
2014-03-25 15:09:24,583 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[language] in 2 millis.
2014-03-25 15:09:24,583 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[namespace] with cardinality[2]
2014-03-25 15:09:24,584 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[namespace] in 1 millis.
2014-03-25 15:09:24,584 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[newpage] with cardinality[2]
2014-03-25 15:09:24,585 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[newpage] in 1 millis.
2014-03-25 15:09:24,585 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[page] with cardinality[5]
2014-03-25 15:09:24,587 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[page] in 2 millis.
2014-03-25 15:09:24,587 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[region] with cardinality[5]
2014-03-25 15:09:24,588 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[region] in 1 millis.
2014-03-25 15:09:24,588 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[robot] with cardinality[2]
2014-03-25 15:09:24,590 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[robot] in 2 millis.
2014-03-25 15:09:24,590 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[unpatrolled] with cardinality[2]
2014-03-25 15:09:24,593 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[unpatrolled] in 3 millis.
2014-03-25 15:09:24,593 INFO [task-runner-0] io.druid.segment.IndexMerger - Starting dimension[user] with cardinality[5]
2014-03-25 15:09:24,596 INFO [task-runner-0] io.druid.segment.IndexMerger - Completed dimension[user] in 3 millis.
2014-03-25 15:09:24,596 INFO [task-runner-0] io.druid.segment.IndexMerger - outDir[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0/v8-tmp] completed inverted.drd in 33 millis.
2014-03-25 15:09:24,608 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Converting v8[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0/v8-tmp] to v9[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0]
2014-03-25 15:09:24,611 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_anonymous.drd]
2014-03-25 15:09:24,614 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[anonymous] is single value, converting...
2014-03-25 15:09:24,624 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_city.drd]
2014-03-25 15:09:24,625 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[city] is single value, converting...
2014-03-25 15:09:24,625 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_continent.drd]
2014-03-25 15:09:24,625 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[continent] is single value, converting...
2014-03-25 15:09:24,626 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_country.drd]
2014-03-25 15:09:24,626 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[country] is single value, converting...
2014-03-25 15:09:24,627 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_language.drd]
2014-03-25 15:09:24,627 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[language] is single value, converting...
2014-03-25 15:09:24,627 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_namespace.drd]
2014-03-25 15:09:24,628 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[namespace] is single value, converting...
2014-03-25 15:09:24,628 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_newpage.drd]
2014-03-25 15:09:24,628 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[newpage] is single value, converting...
2014-03-25 15:09:24,629 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_page.drd]
2014-03-25 15:09:24,629 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[page] is single value, converting...
2014-03-25 15:09:24,630 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_region.drd]
2014-03-25 15:09:24,630 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[region] is single value, converting...
2014-03-25 15:09:24,630 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_robot.drd]
2014-03-25 15:09:24,631 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[robot] is single value, converting...
2014-03-25 15:09:24,631 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_unpatrolled.drd]
2014-03-25 15:09:24,632 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[unpatrolled] is single value, converting...
2014-03-25 15:09:24,632 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[dim_user.drd]
2014-03-25 15:09:24,632 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Dimension[user] is single value, converting...
2014-03-25 15:09:24,633 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[index.drd]
2014-03-25 15:09:24,633 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[inverted.drd]
2014-03-25 15:09:24,633 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_added_LITTLE_ENDIAN.drd]
2014-03-25 15:09:24,637 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_count_LITTLE_ENDIAN.drd]
2014-03-25 15:09:24,638 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_deleted_LITTLE_ENDIAN.drd]
2014-03-25 15:09:24,638 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[met_delta_LITTLE_ENDIAN.drd]
2014-03-25 15:09:24,639 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[spatial.drd]
2014-03-25 15:09:24,639 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Processing file[time_LITTLE_ENDIAN.drd]
2014-03-25 15:09:24,643 INFO [task-runner-0] io.druid.segment.IndexIO$DefaultIndexIOHandler - Skipped files[[index.drd, inverted.drd, spatial.drd]]
2014-03-25 15:09:24,667 INFO [task-runner-0] io.druid.segment.loading.LocalDataSegmentPusher - Compressing files from[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0] to [/tmp/druid/localStorage/wikipedia/2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z/2014-03-25T15:09:19.071Z/0/index.zip]
2014-03-25 15:09:24,669 INFO [task-runner-0] io.druid.utils.CompressionUtils - Adding file[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0/meta.smoosh] with size[359].  Total size so far[0]
2014-03-25 15:09:24,670 INFO [task-runner-0] io.druid.utils.CompressionUtils - Adding file[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0/version.bin] with size[4].  Total size so far[359]
2014-03-25 15:09:24,670 INFO [task-runner-0] io.druid.utils.CompressionUtils - Adding file[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0/00000.smoosh] with size[4,099].  Total size so far[363]
2014-03-25 15:09:24,672 INFO [task-runner-0] io.druid.segment.loading.LocalDataSegmentPusher - Creating descriptor file at[/tmp/druid/localStorage/wikipedia/2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z/2014-03-25T15:09:19.071Z/0/descriptor.json]
2014-03-25 15:09:24,679 INFO [task-runner-0] io.druid.indexing.common.index.YeOldePlumberSchool - Uploaded segment[wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z]
2014-03-25 15:09:24,679 INFO [task-runner-0] io.druid.indexing.common.index.YeOldePlumberSchool - Deleting Index File[/tmp/persistent/task/index_wikipedia_2014-03-25T15:09:19.063Z/work/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z_0/wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-25T15:09:19.071Z/spill0]
2014-03-25 15:09:24,680 INFO [task-runner-0] io.druid.indexing.common.task.IndexTask - Task[index_wikipedia_2014-03-25T15:09:19.063Z] interval[2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z] partition[0] took in 10 rows (10 processed, 0 unparseable, 0 thrown away) and output 5 rows
2014-03-25 15:09:24,687 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_wikipedia_2014-03-25T15:09:19.063Z]: SegmentInsertAction{segments=[DataSegment{size=4462, shardSpec=NoneShardSpec, metrics=[count, added, deleted, delta], dimensions=[anonymous, city, continent, country, language, namespace, newpage, page, region, robot, unpatrolled, user], version='2014-03-25T15:09:19.071Z', loadSpec={type=local, path=/tmp/druid/localStorage/wikipedia/2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z/2014-03-25T15:09:19.071Z/0/index.zip}, interval=2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z, dataSource='wikipedia', binaryVersion='9'}]}
2014-03-25 15:09:24,691 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_wikipedia_2014-03-25T15:09:19.063Z] to overlord[http://localhost:8087/druid/indexer/v1/action]: SegmentInsertAction{segments=[DataSegment{size=4462, shardSpec=NoneShardSpec, metrics=[count, added, deleted, delta], dimensions=[anonymous, city, continent, country, language, namespace, newpage, page, region, robot, unpatrolled, user], version='2014-03-25T15:09:19.071Z', loadSpec={type=local, path=/tmp/druid/localStorage/wikipedia/2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z/2014-03-25T15:09:19.071Z/0/index.zip}, interval=2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z, dataSource='wikipedia', binaryVersion='9'}]}
2014-03-25 15:09:24,798 WARN [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Exception submitting action for task[index_wikipedia_2014-03-25T15:09:19.063Z]
com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: java.io.StringReader@61295616; line: 1, column: 2]
	at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1369)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:532)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:453)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._handleUnexpectedValue(ReaderBasedJsonParser.java:1386)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:669)
	at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:2926)
	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:2873)
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2041)
	at io.druid.indexing.common.actions.RemoteTaskActionClient.submit(RemoteTaskActionClient.java:102)
	at io.druid.indexing.common.TaskToolbox.pushSegments(TaskToolbox.java:204)
	at io.druid.indexing.common.task.IndexTask.run(IndexTask.java:168)
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:224)
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:203)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:724)

The overlord configuration was the default configuration that Druid provides.
Is the setup described in the documentation (http://druid.io/docs/0.6.73/Tutorial:-The-Druid-Cluster.html) local mode rather than remote mode?
So should I run these nodes (real-time, broker, ...) on the same physical machine?

Thanks,
Tao

Nishant Bangarwa

Mar 26, 2014, 4:22:13 AM
to druid-de...@googlegroups.com
Hi Tao,
the exception in the logs indicates that something went wrong on the overlord while performing the SegmentInsertAction.
Can you also share the overlord logs? They will have more details on the exception.




Fangjin Yang

Mar 26, 2014, 11:53:26 AM
to druid-de...@googlegroups.com
Hi Tao,

I actually made a mistake in my previous post: the remote action client logging does occur even when running in local mode. The problem you are seeing now is that some error appears to have occurred on the overlord (perhaps a misconfigured MySQL or some permissions problem), and as Nishant mentioned, the overlord logs will give us more insight into what is happening.
...

tao...@yahoo-inc.com

Mar 26, 2014, 12:12:12 PM
to druid-de...@googlegroups.com
Hi, Fangjin & Nishant:

Here is the overlord log:
2014-03-26 16:04:09,816 INFO [qtp1768045153-24] io.druid.indexing.common.actions.LocalTaskActionClient - Performing action for task[index_wikipedia_2014-03-26T16:04:05.305Z]: SegmentInsertAction{segments=[DataSegment{size=4462, shardSpec=NoneShardSpec, metrics=[count, added, deleted, delta], dimensions=[anonymous, city, continent, country, language, namespace, newpage, page, region, robot, unpatrolled, user], version='2014-03-26T16:04:05.306Z', loadSpec={type=local, path=/tmp/druid/localStorage/wikipedia/2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z/2014-03-26T16:04:05.306Z/0/index.zip}, interval=2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z, dataSource='wikipedia', binaryVersion='9'}]}
Mar 26, 2014 4:04:09 PM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
SEVERE: The RuntimeException could not be mapped to a response, re-throwing to the HTTP container
org.skife.jdbi.v2.exceptions.UnableToCloseResourceException: Unable to close Connection
    at org.skife.jdbi.v2.BasicHandle.close(BasicHandle.java:121)
    at org.skife.jdbi.v2.DBI.withHandle(DBI.java:265)
    at org.skife.jdbi.v2.DBI.inTransaction(DBI.java:284)
    at io.druid.indexing.overlord.IndexerDBCoordinator.announceHistoricalSegments(IndexerDBCoordinator.java:145)
    at io.druid.indexing.common.actions.SegmentInsertAction.perform(SegmentInsertAction.java:84)
    at io.druid.indexing.common.actions.SegmentInsertAction.perform(SegmentInsertAction.java:34)
    at io.druid.indexing.common.actions.LocalTaskActionClient.submit(LocalTaskActionClient.java:64)
    at io.druid.indexing.overlord.http.OverlordResource$3.apply(OverlordResource.java:235)
    at io.druid.indexing.overlord.http.OverlordResource$3.apply(OverlordResource.java:224)
    at io.druid.indexing.overlord.http.OverlordResource.asLeaderWith(OverlordResource.java:457)
    at io.druid.indexing.overlord.http.OverlordResource.doAction(OverlordResource.java:221)
    at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
    at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
    at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
    at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
    at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
    at com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:278)
    at com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:268)
    at com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:180)
    at com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:93)
    at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:85)
    at com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:120)
    at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:132)
    at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:129)
    at com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:206)
    at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:129)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
    at org.eclipse.jetty.servlets.UserAgentFilter.doFilter(UserAgentFilter.java:82)
    at org.eclipse.jetty.servlets.GzipFilter.doFilter(GzipFilter.java:256)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
    at io.druid.server.http.RedirectFilter.doFilter(RedirectFilter.java:71)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:229)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
    at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
    at org.eclipse.jetty.server.Server.handle(Server.java:370)
    at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:489)
    at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:960)
    at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1021)
    at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)
    at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)
    at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:668)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
    at java.lang.Thread.run(Thread.java:724)
Caused by: java.sql.SQLException: Already closed.
    at org.apache.commons.dbcp.PoolableConnection.close(PoolableConnection.java:114)
    at org.apache.commons.dbcp.PoolingDataSource$PoolGuardConnectionWrapper.close(PoolingDataSource.java:191)
    at org.skife.jdbi.v2.BasicHandle.close(BasicHandle.java:118)
    ... 66 more

It seems to be a MySQL issue.
But I already ran these commands to grant the 'druid' user permissions and create the 'druid' database:
GRANT ALL ON druid.* TO 'druid'@'localhost' IDENTIFIED BY 'diurd';
CREATE database druid;
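
For reference, the overlord's DB connector settings need to point at the same database and credentials. A minimal sketch of the relevant runtime.properties lines, assuming MySQL on localhost:3306 and the database created above (adjust host and port to your setup):

druid.db.connector.connectURI=jdbc:mysql://localhost:3306/druid
druid.db.connector.user=druid
druid.db.connector.password=diurd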

Thanks,
Tao

Fangjin Yang

unread,
Mar 26, 2014, 12:14:10 PM3/26/14
to druid-de...@googlegroups.com
Hi Tao,

One quick check, can you add -Ddruid.db.connector.useValidationQuery=true to your overlord config?
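
For example, as a JVM flag on the overlord's startup command (a sketch only; adjust memory, classpath, and config paths to however you launch the overlord):

java -Xmx2g -Ddruid.db.connector.useValidationQuery=true -classpath lib/*:config/overlord io.druid.cli.Main server overlord

or, equivalently, as a line in the overlord's runtime.properties:

druid.db.connector.useValidationQuery=true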
...

tao...@yahoo-inc.com

unread,
Mar 26, 2014, 12:29:35 PM3/26/14
to druid-de...@googlegroups.com
Hi, Fangjin:

I added -Ddruid.db.connector.useValidationQuery=true to the overlord config, and the error log changed:
Caused by: org.skife.jdbi.v2.exceptions.UnableToExecuteStatementException: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Incorrect date value: '2014-03-26T16:21:36.606Z' for column 'created_date' at row 1 [statement:"INSERT INTO druid_segments (id, dataSource, created_date, start, end, partitioned, version, used, payload) VALUES (:id, :dataSource, :created_date, :start, :end, :partitioned, :version, :used, :payload)", located:"INSERT INTO druid_segments (id, dataSource, created_date, start, end, partitioned, version, used, payload) VALUES (:id, :dataSource, :created_date, :start, :end, :partitioned, :version, :used, :payload)", rewritten:"INSERT INTO druid_segments (id, dataSource, created_date, start, end, partitioned, version, used, payload) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)", arguments:{ positional:{}, named:{created_date:'2014-03-26T16:21:36.606Z',id:'wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-26T16:21:32.072Z',dataSource:'wikipedia',partitioned:0,start:'2013-08-31T00:00:00.000Z',payload:'{"dataSource":"wikipedia","interval":"2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z","version":"2014-03-26T16:21:32.072Z","loadSpec":{"type":"local","path":"/tmp/druid/localStorage/wikipedia/2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z/2014-03-26T16:21:32.072Z/0/index.zip"},"dimensions":"anonymous,city,continent,country,language,namespace,newpage,page,region,robot,unpatrolled,user","metrics":"count,added,deleted,delta","shardSpec":{"type":"none"},"binaryVersion":9,"size":4462,"identifier":"wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-26T16:21:32.072Z"}',used:true,end:'2013-09-01T00:00:00.000Z',version:'2014-03-26T16:21:32.072Z'}, finder:[]}]

The 'druid_segments' table was manually created in 'druid' database.
And the field types are as described below:
+--------------+---------------+------+-----+---------+-------+
| Field        | Type          | Null | Key | Default | Extra |
+--------------+---------------+------+-----+---------+-------+
| id           | varchar(300)  | NO   | PRI | NULL    |       |
| dataSource   | char(150)     | YES  |     | NULL    |       |
| created_date | date          | YES  |     | NULL    |       |
| start        | date          | YES  |     | NULL    |       |
| end          | date          | YES  |     | NULL    |       |
| partitioned  | int(11)       | YES  |     | NULL    |       |
| version      | varchar(300)  | YES  |     | NULL    |       |
| used         | int(11)       | YES  |     | NULL    |       |
| payload      | varchar(1000) | YES  |     | NULL    |       |
+--------------+---------------+------+-----+---------+-------+

Is there something wrong with the field types I defined?

Thanks,
Tao 

Nishant Bangarwa

unread,
Mar 26, 2014, 12:36:35 PM3/26/14
to druid-de...@googlegroups.com
Hi Tao,
Druid should be able to create the DB tables automatically for you; there is no need to create the table manually.
Can you try working with a clean DB?

BTW, the SQL schema of the table at my end is:
mysql> describe test_segments;
+--------------+--------------+------+-----+---------+-------+
| Field        | Type         | Null | Key | Default | Extra |
+--------------+--------------+------+-----+---------+-------+
| id           | varchar(255) | NO   | PRI | NULL    |       |
| dataSource   | varchar(255) | NO   | MUL | NULL    |       |
| created_date | tinytext     | NO   |     | NULL    |       |
| start        | tinytext     | NO   |     | NULL    |       |
| end          | tinytext     | NO   |     | NULL    |       |
| partitioned  | tinyint(1)   | NO   |     | NULL    |       |
| version      | tinytext     | NO   |     | NULL    |       |
| used         | tinyint(1)   | NO   | MUL | NULL    |       |
| payload      | longtext     | NO   |     | NULL    |       |
+--------------+--------------+------+-----+---------+-------+




Fangjin Yang

unread,
Mar 26, 2014, 12:36:17 PM3/26/14
to druid-de...@googlegroups.com
Ah, can you try removing that table and having Druid automatically create it?

The default creation is:
mysql> describe druid_segments;
+--------------+--------------+------+-----+---------+-------+
| Field        | Type         | Null | Key | Default | Extra |
+--------------+--------------+------+-----+---------+-------+
| id           | varchar(255) | NO   | PRI | NULL    |       |
| dataSource   | varchar(255) | NO   | MUL | NULL    |       |
| created_date | tinytext     | NO   |     | NULL    |       |
| start        | tinytext     | NO   |     | NULL    |       |
| end          | tinytext     | NO   |     | NULL    |       |
| partitioned  | tinyint(1)   | NO   |     | NULL    |       |
| version      | tinytext     | NO   |     | NULL    |       |
| used         | tinyint(1)   | NO   | MUL | NULL    |       |
| payload      | longtext     | NO   |     | NULL    |       |
+--------------+--------------+------+-----+---------+-------+
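
If the table still does not get created automatically, it is also worth checking that table auto-creation has not been switched off. To my recollection the 0.6.x DB connector has a createTables option that defaults to true (treat the exact property name as an assumption and verify it against your version's configuration docs):

druid.db.connector.createTables=true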


Fangjin Yang

unread,
Mar 26, 2014, 12:37:54 PM3/26/14
to druid-de...@googlegroups.com
I believe the problem is that in your manually created table, MySQL's DATE type only understands YYYY-MM-DD, not the ISO 8601 timestamps that Druid writes.
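
If the table does have to be created by hand, a definition matching the default schema shown above would look roughly like this (a sketch reconstructed from the describe output; any engine or charset options Druid itself uses are not shown):

CREATE TABLE druid_segments (
  id VARCHAR(255) NOT NULL,
  dataSource VARCHAR(255) NOT NULL,
  created_date TINYTEXT NOT NULL,
  `start` TINYTEXT NOT NULL,
  `end` TINYTEXT NOT NULL,
  partitioned TINYINT(1) NOT NULL,
  version TINYTEXT NOT NULL,
  used TINYINT(1) NOT NULL,
  payload LONGTEXT NOT NULL,
  PRIMARY KEY (id),
  INDEX (dataSource),
  INDEX (used)
);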


tao...@yahoo-inc.com

unread,
Mar 26, 2014, 1:05:29 PM3/26/14
to druid-de...@googlegroups.com
Yes, that is probably right.
I dropped the table, but I still get an exception.
It seems this table isn't being generated automatically.
I now remember why I created it manually: I kept seeing the error log "Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'druid.druid_segments' doesn't exist".


2014-03-26 16:58:43,805 INFO [qtp1966242435-21] io.druid.indexing.overlord.TaskLockbox - Task[index_wikipedia_2014-03-26T16:58:38.580Z] already present in TaskLock[index_wikipedia_2014-03-26T16:58:38.580Z]
2014-03-26 16:58:43,849 INFO [qtp1966242435-28] io.druid.indexing.common.actions.LocalTaskActionClient - Performing action for task[index_wikipedia_2014-03-26T16:58:38.580Z]: LockListAction{}
2014-03-26 16:58:44,124 INFO [qtp1966242435-26] io.druid.indexing.common.actions.LocalTaskActionClient - Performing action for task[index_wikipedia_2014-03-26T16:58:38.580Z]: SegmentInsertAction{segments=[DataSegment{size=4462, shardSpec=NoneShardSpec, metrics=[count, added, deleted, delta], dimensions=[anonymous, city, continent, country, language, namespace, newpage, page, region, robot, unpatrolled, user], version='2014-03-26T16:58:38.587Z', loadSpec={type=local, path=/tmp/druid/localStorage/wikipedia/2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z/2014-03-26T16:58:38.587Z/0/index.zip}, interval=2013-08-31T00:00:00.000Z/2013-09-01T00:00:00.000Z, dataSource='wikipedia', binaryVersion='9'}]}
Mar 26, 2014 4:58:44 PM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
SEVERE: The RuntimeException could not be mapped to a response, re-throwing to the HTTP container
org.skife.jdbi.v2.exceptions.CallbackFailedException: org.skife.jdbi.v2.exceptions.UnableToExecuteStatementException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'druid.druid_segments' doesn't exist [statement:"SELECT id FROM druid_segments WHERE id = :identifier", located:"SELECT id FROM druid_segments WHERE id = :identifier", rewritten:"SELECT id FROM druid_segments WHERE id = ?", arguments:{ positional:{}, named:{identifier:'wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-26T16:58:38.587Z'}, finder:[]}]
    at org.skife.jdbi.v2.DBI.withHandle(DBI.java:262)
    at org.skife.jdbi.v2.DBI.inTransaction(DBI.java:284)
    at io.druid.indexing.overlord.IndexerDBCoordinator.announceHistoricalSegments(IndexerDBCoordinator.java:145)
    at io.druid.indexing.common.actions.SegmentInsertAction.perform(SegmentInsertAction.java:84)
    at io.druid.indexing.common.actions.SegmentInsertAction.perform(SegmentInsertAction.java:34)
    at io.druid.indexing.common.actions.LocalTaskActionClient.submit(LocalTaskActionClient.java:64)
    at io.druid.indexing.overlord.http.OverlordResource$3.apply(OverlordResource.java:235)
    at io.druid.indexing.overlord.http.OverlordResource$3.apply(OverlordResource.java:224)
    at io.druid.indexing.overlord.http.OverlordResource.asLeaderWith(OverlordResource.java:457)
    at io.druid.indexing.overlord.http.OverlordResource.doAction(OverlordResource.java:221)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
Caused by: org.skife.jdbi.v2.exceptions.UnableToExecuteStatementException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'druid.druid_segments' doesn't exist [statement:"SELECT id FROM druid_segments WHERE id = :identifier", located:"SELECT id FROM druid_segments WHERE id = :identifier", rewritten:"SELECT id FROM druid_segments WHERE id = ?", arguments:{ positional:{}, named:{identifier:'wikipedia_2013-08-31T00:00:00.000Z_2013-09-01T00:00:00.000Z_2014-03-26T16:58:38.587Z'}, finder:[]}]
    at org.skife.jdbi.v2.SQLStatement.internalExecute(SQLStatement.java:1306)
    at org.skife.jdbi.v2.Query.fold(Query.java:172)
    at org.skife.jdbi.v2.Query.list(Query.java:84)
    at org.skife.jdbi.v2.Query.list(Query.java:78)
    at io.druid.indexing.overlord.IndexerDBCoordinator.segmentExists(IndexerDBCoordinator.java:222)
    at io.druid.indexing.overlord.IndexerDBCoordinator.announceHistoricalSegment(IndexerDBCoordinator.java:174)
    at io.druid.indexing.overlord.IndexerDBCoordinator.access$200(IndexerDBCoordinator.java:57)
    at io.druid.indexing.overlord.IndexerDBCoordinator$3.inTransaction(IndexerDBCoordinator.java:154)
    at io.druid.indexing.overlord.IndexerDBCoordinator$3.inTransaction(IndexerDBCoordinator.java:147)
    at org.skife.jdbi.v2.BasicHandle.inTransaction(BasicHandle.java:319)
    at org.skife.jdbi.v2.DBI$4.withHandle(DBI.java:287)
    at org.skife.jdbi.v2.DBI.withHandle(DBI.java:259)
    ... 66 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'druid.druid_segments' doesn't exist
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
    at com.mysql.jdbc.Util.getInstance(Util.java:386)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3609)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3541)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2002)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2163)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2624)
    at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2127)
    at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:1362)
    at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
    at org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
    at org.skife.jdbi.v2.SQLStatement.internalExecute(SQLStatement.java:1300)
    ... 77 more

Thanks,
Tao

tao...@yahoo-inc.com

unread,
Mar 26, 2014, 1:29:41 PM3/26/14
to druid-de...@googlegroups.com
Oh! Yeah!! SUCCESS!
But I created this table manually.

Thanks Fangjin and Nishant very much!!!!! ^_^
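
For anyone hitting the same thing, the segment metadata can be verified with a quick query against the table (column names as in the schemas above):

mysql> SELECT id, dataSource, created_date, used FROM druid_segments;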
...