Re: [druid-dev] Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]

Nishant Bangarwa

Oct 6, 2014, 11:11:22 AM10/6/14
to druid-de...@googlegroups.com
Hi, 
I guess either you are not sending any data, or it is being dropped because the data you are ingesting falls outside the windowPeriod.
As per your config the windowPeriod is set to 1 hr, so any data older than that will be dropped.
Verify that the data you are ingesting is not older than 1 hr and that there are no exceptions in the Storm logs.
You can also check whether a clock or timezone issue is causing this.
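To make the window concrete, here is a small illustrative sketch (not Druid's actual rejection-policy code) of the kind of check a windowPeriod implies: an event whose timestamp falls outside now +/- windowPeriod simply gets thrown away.

import org.joda.time.DateTime;
import org.joda.time.Period;

public class WindowCheck {
    // Illustrative only: accept an event whose timestamp is within
    // [now - windowPeriod, now + windowPeriod]; anything else is dropped.
    public static boolean accept(DateTime eventTime, Period windowPeriod) {
        final DateTime now = DateTime.now();
        return !eventTime.isBefore(now.minus(windowPeriod))
                && !eventTime.isAfter(now.plus(windowPeriod));
    }

    public static void main(String[] args) {
        // an event two hours old checked against a 1 hr window -> rejected
        System.out.println(accept(DateTime.now().minusHours(2), Period.hours(1))); // prints false
    }
}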

On Mon, Oct 6, 2014 at 8:24 PM, <mar.s...@tuguu.com> wrote:
Hi!

I've got an issue here and I don't really know what it could be; it seems like Druid can't find my data when I send a task.
I'm sending to the realtime node using Tranquility (on Storm).
The task is in pastebin: http://pastebin.com/sJQ3fX9r

Could you please take a look at the Druid task's log? I'm worried about Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]:
2014-10-06 14:28:55,083 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Running task: index_realtime_ttb_advertiser_csv_2014-10-06T15:28:00.000+01:00_0_0_mcfnknhj
2014-10-06 14:28:55,084 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_realtime_ttb_advertiser_csv_2014-10-06T15:28:00.000+01:00_0_0_mcfnknhj]: LockListAction{}
2014-10-06 14:28:55,090 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_realtime_ttb_advertiser_csv_2014-10-06T15:28:00.000+01:00_0_0_mcfnknhj] to overlord[http://druid-historic01-dev.local.tuguu.com:8087/druid/indexer/v1/action]: LockListAction{}
2014-10-06 14:28:55,099 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev.local.tuguu.com:8087
2014-10-06 14:28:55,122 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev.local.tuguu.com:8087
2014-10-06 14:28:55,122 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev.local.tuguu.com:8087
2014-10-06 14:28:55,123 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev.local.tuguu.com:8087
2014-10-06 14:28:55,123 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev.local.tuguu.com:8087
2014-10-06 14:28:55,158 INFO [task-runner-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Connecting firehose: druid:local:firehose:ttb_advertiser_csv-28-0000-0000
2014-10-06 14:28:55,159 INFO [task-runner-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Found chathandler of class[io.druid.segment.realtime.firehose.NoopChatHandlerProvider]
2014-10-06 14:28:55,160 INFO [task-runner-0] io.druid.data.input.FirehoseFactory - Firehose created, will shut down at: 2014-10-06T16:34:00.000+01:00
2014-10-06 14:28:55,166 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Creating plumber using rejectionPolicy[io.druid.segment.realtime.plumber.NoopRejectionPolicyFactory$1@400621ae]
2014-10-06 14:28:55,169 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Expect to run at [2014-10-06T15:29:00.000Z]
2014-10-06 14:28:55,171 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2014-10-06 14:28:55,171 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]
2014-10-06 14:28:55,171 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2014-10-06 14:29:55,181 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-06T14:29:55.173Z","service":"overlord","host":"druid-historic01-dev.local.tuguu.com:8088","metric":"events/thrownAway","value":0,"user2":"ttb_advertiser_csv"}]
2014-10-06 14:29:55,181 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-06T14:29:55.181Z","service":"overlord","host":"druid-historic01-dev.local.tuguu.com:8088","metric":"events/unparseable","value":0,"user2":"ttb_advertiser_csv"}]
2014-10-06 14:29:55,181 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-06T14:29:55.181Z","service":"overlord","host":"druid-historic01-dev.local.tuguu.com:8088","metric":"events/processed","value":0,"user2":"ttb_advertiser_csv"}]
2014-10-06 14:29:55,182 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-06T14:29:55.181Z","service":"overlord","host":"druid-historic01-dev.local.tuguu.com:8088","metric":"rows/output","value":0,"user2":"ttb_advertiser_csv"}]
..... (etc... for each minute)

After this ends, I don't have any new data in Druid, but I always get "status" : "SUCCESS".

Fangjin Yang

Oct 6, 2014, 4:19:11 PM10/6/14
to druid-de...@googlegroups.com
There are a few problems with the configuration, based on the chats in IRC, and they come from the fact that the default overlord settings won't just work out of the box.

The following is required:
1)
druid.indexer.task.chathandler.type=announce
druid.selectors.indexing.serviceName=overlord
2)
In addition, as ingestion volumes increase, you will have to consider setting
druid.indexer.runner.javaOpts to give more resources to the peons,
or running the indexing service in remote mode.
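For reference, a minimal sketch of how these settings might sit together in a local-mode overlord runtime.properties (hostnames, ports, and memory values here are only placeholders, not recommendations):

druid.host=localhost
druid.port=8087
druid.service=overlord

# must match the service name Tranquility resolves (the DruidEnvironment name / .discoveryPath)
druid.selectors.indexing.serviceName=overlord
# lets peon tasks announce their firehose endpoints so Tranquility can reach them
druid.indexer.task.chathandler.type=announce

# give the peons more resources as ingestion volume grows
druid.indexer.runner.javaOpts=-server -Xmx1g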

mar.s...@tuguu.com

Oct 7, 2014, 6:47:57 AM10/7/14
to druid-de...@googlegroups.com
Thank you for the quick replies. I did everything I could, but I still have some problems.

Here is part of the code. As you can see, for testing purposes I attach the current Joda DateTime to each event, and the windowPeriod is set to 10 minutes (PT10M):

            final DruidBeams.Builder<String> builder = DruidBeams
                    .builder(
                            new Timestamper<String>() {
                                @Override
                                public DateTime timestamp(String theMap) {
                                    return new DateTime();
                                }
                            }
                    )
                    .curator(curator)
                    .discoveryPath("/druid/discovery")
                    .location(
                            new DruidLocation(
                                    new DruidEnvironment(
                                            "overlord",
                                            "druid:local:firehose:%s"
                                    ), dataSource
                            )
                    )
                    .rollup(DruidRollup.create(dimensions, aggregators, QueryGranularity.NONE))
                    .tuning(ClusteredBeamTuning.create(Granularity.MINUTE, new Period("PT0H0M0S"), new Period("PT10M"), 1, 1));

            final Beam<String> beam = builder.buildBeam();


1) New parameters are in config:
druid.indexer.task.chathandler.type=announce
druid.selectors.indexing.serviceName=overlord

Gian Merlino

Oct 7, 2014, 1:25:02 PM10/7/14
to druid-de...@googlegroups.com
It looks like tranquility is not able to find the druid indexing task. I see you have already shared your tranquility config (the code) but can you please also share your runtime.properties for druid indexing? It's possible something is not in sync between them.

mar.s...@tuguu.com

Oct 8, 2014, 5:45:20 AM10/8/14
to druid-de...@googlegroups.com
Yes, of course, thanks for taking a look.

druid.host=druid-historic01-dev
druid.port=8087
druid.service=overlord

druid.zk.service.host=kafka01-dev:2181

druid.db.connector.connectURI=jdbc:mysql://crud-mysql01-dev:3306/druid
druid.db.connector.user=druid
druid.db.connector.password=****

druid.selectors.indexing.serviceName=overlord
druid.indexer.runner.javaOpts="-server -Xmx1g"
druid.indexer.runner.startPort=8088
druid.indexer.fork.property.druid.computation.buffer.size=268435456


mar.s...@tuguu.com

Oct 8, 2014, 7:03:00 AM10/8/14
to druid-de...@googlegroups.com
Is it possible to debug / dump the data before Tranquility sends it to Druid, and to convert the string to a map at the same time? It seems to expect a map instead of a string.
Oct 08, 2014 9:41:51 AM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
SEVERE: The exception contained within MappableContainerException could not be mapped to a response, re-throwing to the HTTP container
com.fasterxml.jackson.databind.JsonMappingException: Can not instantiate value of type [map type; class java.util.LinkedHashMap, [simple type, class java.lang.String] -> [simple type, class java.lang.Object]] from String value; no single-String constructor/factory method
full log with the task: http://pastebin.com/G04H5sTp


Gian Merlino

Oct 8, 2014, 12:04:59 PM10/8/14
to druid-de...@googlegroups.com
Ah okay, I see. I didn't notice you were sending a String directly. Tranquility sends your objects to Druid in batches, and attempts to serialize each batch as an array of JSON objects. By default it uses Jackson to do this. Since your object type is just String, they will actually be serialized to an array of JSON strings, which causes the confusion you are seeing on the Druid server. (Druid expects an array of objects so it can convert them to Maps.)

You have a few possible solutions:

- Use a data type other than String; something that can be serialized using Jackson into a JSON object. Any POJO should do the trick, as should something like Map<String, Object> where the values are simple types like String, Long, Double, and so on (see the sketch after this list).

- You can provide an implementation of the JsonWriter interface to the DruidBeams builder (through .eventWriter), which will teach Tranquility how to convert your Strings into JSON objects. You have to implement one method: viaJsonGenerator(T, JsonGenerator), which will get your string and a JsonGenerator and should call methods on that JsonGenerator.

- The currently stable version of tranquility doesn't have this capability, but in tranquility 0.2.22+ you can alternatively provide an implementation of ObjectWriter or JavaObjectWriter (through .objectWriter) as a serialization method. The interface you need to implement there is batchAsBytes(Iterator[T]), which will get an Iterator of a batch of your strings and needs to generate a JSON array. If the strings are already serialized JSON objects, this will probably be easier for you, since you can just join them with ","s and surround them with "[" and "]".
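For the first option, here is a rough sketch of what switching the event type from String to Map<String, Object> could look like against the builder code you posted earlier. Treat it as illustrative only: it reuses the same curator / dataSource / dimensions / aggregators variables from your snippet, and it assumes each event carries its ISO timestamp under "created_at".

final DruidBeams.Builder<Map<String, Object>> builder = DruidBeams
        .builder(
                new Timestamper<Map<String, Object>>() {
                    @Override
                    public DateTime timestamp(Map<String, Object> event) {
                        // assumes each event map carries its ISO timestamp under "created_at"
                        return new DateTime(event.get("created_at"));
                    }
                }
        )
        .curator(curator)
        .discoveryPath("/druid/discovery")
        .location(new DruidLocation(new DruidEnvironment("overlord", "druid:local:firehose:%s"), dataSource))
        .rollup(DruidRollup.create(dimensions, aggregators, QueryGranularity.NONE))
        .tuning(ClusteredBeamTuning.create(Granularity.MINUTE, new Period("PT0H0M0S"), new Period("PT10M"), 1, 1));

final Beam<Map<String, Object>> beam = builder.buildBeam();

// each event is then a plain Map of simple values, which Jackson serializes
// as a JSON object instead of a JSON string
final Map<String, Object> event = new HashMap<String, Object>();
event.put("created_at", new DateTime().toString());
event.put("country_id", "mar");
event.put("installs", 2);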

mar.s...@tuguu.com

Oct 8, 2014, 12:05:35 PM10/8/14
to druid-de...@googlegroups.com
Fixed mapMappableContainerException...


Out of nowhere, I got an error in Druid's log:
Oct 08, 2014 11:33:07 AM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
SEVERE: The RuntimeException could not be mapped to a response, re-throwing to the HTTP container
com.metamx.common.parsers.ParseException: Unparseable timestamp found!
	at io.druid.data.input.impl.MapInputRowParser.parse(MapInputRowParser.java:76)
	at io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory$EventReceiverFirehose.addAll(EventReceiverFirehoseFactory.java:148)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
	at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
	at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
	at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
	at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
	at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
	at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
	at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
	at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
	at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:278)
	at com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:268)
	at com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:180)
	at com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:93)
	at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:85)
	at com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:120)
	at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:132)
	at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:129)
	at com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:206)
	at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:129)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
	at org.eclipse.jetty.servlets.UserAgentFilter.doFilter(UserAgentFilter.java:83)
	at org.eclipse.jetty.servlets.GzipFilter.doFilter(GzipFilter.java:300)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1125)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1059)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
	at org.eclipse.jetty.server.Server.handle(Server.java:485)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:290)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:248)
	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:606)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:535)
	at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.NullPointerException: Null timestamp in input: {installs=9, created_at=2014-10-08T12:30:50Z, country_id=mar, promo_id=mar, toolbar_id=mar, affiliat...
	at io.druid.data.input.impl.MapInputRowParser.parse(MapInputRowParser.java:67)
	... 53 more
Now at least I see that Druid knows about the data I'm pushing to it all the time... {installs=9, created_at=2014-10-08T12:30:50Z, country_id=mar, promo_id=mar, toolbar_id=mar, affiliat...
When I was playing with the timestamp I fixed it, and now I don't have any errors... but I'm still getting zero data in Druid... why?

2014-10-08 11:44:00,003 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2014-10-08 11:44:00,003 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]
2014-10-08 11:44:00,003 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2014-10-08 11:44:06,963 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-08T11:44:06.962Z","service":"overlord","host":"druid-historic01-dev.local.tuguu.com:8088","metric":"events/thrownAway","value":0,"user2":"crud_historic_toolbars"}]
2014-10-08 11:44:06,963 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-08T11:44:06.963Z","service":"overlord","host":"druid-historic01-dev.local.tuguu.com:8088","metric":"events/unparseable","value":0,"user2":"crud_historic_toolbars"}]
2014-10-08 11:44:06,963 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-08T11:44:06.963Z","service":"overlord","host":"druid-historic01-dev.local.tuguu.com:8088","metric":"events/processed","value":0,"user2":"crud_historic_toolbars"}]
2014-10-08 11:44:06,964 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-08T11:44:06.963Z","service":"overlord","host":"druid-historic01-dev.local.tuguu.com:8088","metric":"rows/output","value":0,"user2":"crud_historic_toolbars"}]


Also, I found an error while starting ZooKeeper; I don't know how bad it is or how long I will keep seeing it...
15047 [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2000] WARN org.apache.zookeeper.server.NIOServerCnxn - caught end of stream exception
org.apache.zookeeper.server.ServerCnxn$EndOfStreamException: Unable to read additional data from client sessionid 0x148f069a6380002, likely client has closed socket
at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:220) ~[zookeeper-3.4.5.jar:3.4.5-1392090]
at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:208) [zookeeper-3.4.5.jar:3.4.5-1392090]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_65]

Sometimes I get this error if I have lots of data in Kinesis; I think it's because of low memory or missing middle managers, so I don't worry about it:
52230 [New I/O worker #1] WARN com.metamx.tranquility.finagle.FutureRetry$ - Transient error, will try again in 7741 ms
java.io.IOException: Unable to push events to task: index_realtime_crud_historic_toolbars_2014-10-08T15:54:00.000Z_0_0_dbhfpihb (status = TaskRunning)
at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$3$$anonfun$applyOrElse$2.apply(DruidBeam.scala:149) ~[tranquility_2.10-0.2.16.jar:0.2.16]
at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$3$$anonfun$applyOrElse$2.apply(DruidBeam.scala:135) ~[tranquility_2.10-0.2.16.jar:0.2.16]


Gian Merlino

Oct 8, 2014, 12:17:46 PM10/8/14
to druid-de...@googlegroups.com
If your timestamp is in "created_at" then your timestampSpec needs to reflect that. The default is that timestamp is in "timestamp" and the format is "iso". You can override it by passing a .timestampSpec(...) to the DruidBeams builder.

It does look like your format is ISO, so that part should be okay.
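For example, something along these lines on the builder. This is only a sketch: "timestamper" stands for the Timestamper you already have, and it assumes the two-argument TimestampSpec(column, format) constructor from io.druid.data.input.impl is available in your Druid version.

final DruidBeams.Builder<Map<String, Object>> builder = DruidBeams
        .builder(timestamper)                                    // your Timestamper, as before
        .timestampSpec(new TimestampSpec("created_at", "iso"))   // timestamp column + format
        // ... .curator / .discoveryPath / .location / .rollup / .tuning unchanged
        ;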

mar.s...@tuguu.com

Oct 9, 2014, 11:14:34 AM10/9/14
to druid-de...@googlegroups.com
That's it! I've GOT DATA in Druid!!! Thank You All!!

1. Now the legendary ingestion into Druid continues to amaze me.
During the first successful task, events kept coming in, so Tranquility prepared a new task and pushed the new events to it.
Logical? Yes. Working? No... I missed test events; they are lost in the middle of nowhere...
Please take a look at my logs! Tranquility logs:
3565822 [Hashed wheel timer #1] WARN  com.metamx.tranquility.beam.ClusteredBeam - Emitting alert: [anomaly] Failed to propagate events: overlord/crud_historic_toolbars
{
  "eventCount" : 6,
  "timestamp" : "2014-10-09T14:21:00.000Z",
  "beams" : "HashPartitionBeam(DruidBeam(timestamp = 2014-10-09T15:21:00.000+01:00, partition = 0, tasks = [index_realtime_crud_historic_toolbars_2014-10-09T14:21:00.000Z_0_0_fbflapoc/crud_historic_toolbars-21-0000-0000]))"
}
com.twitter.finagle.GlobalRequestTimeoutException: exceeded 1.minutes+30.seconds to druid:local:firehose:crud_historic_toolbars-21-0000-0000 while waiting for a response for the request, including retries (if applicable)
        at com.twitter.finagle.NoStacktrace(Unknown Source) ~[na:na]
3565823 [Hashed wheel timer #1] INFO  com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"alerts","timestamp":"2014-10-09T15:24:11.989+01:00","service":"tranquility","host":"localhost","severity":"anomaly","description":"Failed to propagate events: overlord/crud_historic_toolbars","data":{"exceptionType":"com.twitter.finagle.GlobalRequestTimeoutException","exceptionStackTrace":"com.twitter.finagle.GlobalRequestTimeoutException: exceeded 1.minutes+30.seconds to druid:local:firehose:crud_historic_toolbars-21-0000-0000 while waiting for a response for the request, including retries (if applicable)\n\tat com.twitter.finagle.NoStacktrace(Unknown Source)\n","timestamp":"2014-10-09T14:21:00.000Z","beams":"HashPartitionBeam(DruidBeam(timestamp = 2014-10-09T15:21:00.000+01:00, partition = 0, tasks = [index_realtime_crud_historic_toolbars_2014-10-09T14:21:00.000Z_0_0_fbflapoc/crud_historic_toolbars-21-0000-0000]))","eventCount":6,"exceptionMessage":"exceeded 1.minutes+30.seconds to druid:local:firehose:crud_historic_toolbars-21-0000-0000 while waiting for a response for the request, including retries (if applicable)"}}]
3565823 [BeamBolt-Emitter-tranquility-0] INFO  com.metamx.tranquility.storm.BeamBolt - Sent 0, ignored 6 queued events.


Maybe I'm missing events because of rejectionPolicyFactory:"none"

The Druid task's EMPTY log:
2014-10-09 14:36:16,518 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Running with task: {
  "type" : "index_realtime",
  "id" : "index_realtime_crud_historic_toolbars_2014-10-09T14:21:00.000Z_0_0_fbflapoc",
  "resource" : {
    "availabilityGroup" : "crud_historic_toolbars-21-0000",
    "requiredCapacity" : 1
  },
  "spec" : {
    "dataSchema" : {
      "dataSource" : "crud_historic_toolbars",
      "parser" : {
        "type" : "map",
        "parseSpec" : {
          "format" : "json",
          "timestampSpec" : {
            "column" : "created_at",
            "format" : "iso"
          },
          "dimensionsSpec" : {
            "dimensions" : [ "affiliate_id", "country_id", "promo_id", "app", "traffic_source_id", "toolbar_id", "channel_id", "advertiser_channel", "created_at" ],
            "dimensionExclusions" : [ ],
            "spatialDimensions" : [ ]
          }
        }
      },
      "metricsSpec" : [ {
        "type" : "doubleSum",
        "name" : "installs",
        "fieldName" : "installs"
      }, {
        "type" : "doubleSum",
        "name" : "error_90",
        "fieldName" : "error_90"
      } ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "MINUTE",
        "queryGranularity" : {
          "type" : "none"
        },
        "intervals" : null
      }
    },
    "ioConfig" : {
      "type" : "realtime",
      "firehose" : {
        "type" : "clipped",
        "delegate" : {
          "type" : "timed",
          "delegate" : {
            "type" : "receiver",
            "serviceName" : "druid:local:firehose:crud_historic_toolbars-21-0000-0000",
            "bufferSize" : 100000,
            "parser" : {
              "type" : "map",
              "parseSpec" : {
                "format" : "json",
                "timestampSpec" : {
                  "column" : "created_at",
                  "format" : "iso"
                },
                "dimensionsSpec" : {
                  "dimensions" : [ "affiliate_id", "country_id", "promo_id", "app", "traffic_source_id", "toolbar_id", "channel_id", "advertiser_channel", "created_at" ],
                  "dimensionExclusions" : [ ],
                  "spatialDimensions" : [ ]
                }
              }
            }
          },
          "shutoffTime" : "2014-10-09T14:37:00.000Z"
        },
        "interval" : "2014-10-09T14:21:00.000Z/2014-10-09T14:22:00.000Z"
      }
    },
    "tuningConfig" : {
      "type" : "realtime",
      "maxRowsInMemory" : 75000,
      "intermediatePersistPeriod" : "PT10M",
      "windowPeriod" : "PT10M",
      "basePersistDirectory" : "/tmp/1412861148060-0",
      "versioningPolicy" : {
        "type" : "intervalStart"
      },
      "maxPendingPersists" : 1,
      "shardSpec" : {
        "type" : "linear",
        "partitionNum" : 0
      },
      "rejectionPolicyFactory" : {
        "type" : "none"
      }
    }
  },
  "groupId" : "index_realtime_crud_historic_toolbars",
  "dataSource" : "crud_historic_toolbars"
}
2014-10-09 14:36:16,526 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Running task: index_realtime_crud_historic_toolbars_2014-10-09T14:21:00.000Z_0_0_fbflapoc
2014-10-09 14:36:16,527 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_realtime_crud_historic_toolbars_2014-10-09T14:21:00.000Z_0_0_fbflapoc]: LockListAction{}
2014-10-09 14:36:16,532 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_realtime_crud_historic_toolbars_2014-10-09T14:21:00.000Z_0_0_fbflapoc] to overlord[http://druid-historic01-dev:8087/druid/indexer/v1/action]: LockListAction{}
2014-10-09 14:36:16,541 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-09 14:36:16,563 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-09 14:36:16,563 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-09 14:36:16,563 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-09 14:36:16,563 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-09 14:36:16,597 INFO [task-runner-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Connecting firehose: druid:local:firehose:crud_historic_toolbars-21-0000-0000
2014-10-09 14:36:16,598 INFO [task-runner-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Found chathandler of class[io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider]
2014-10-09 14:36:16,598 INFO [task-runner-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Registering Eventhandler[druid:local:firehose:crud_historic_toolbars-21-0000-0000]
2014-10-09 14:36:16,600 INFO [task-runner-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Announcing service[DruidNode{serviceName='druid:local:firehose:crud_historic_toolbars-21-0000-0000', host='druid-historic01-dev:8088', port=8088}]
2014-10-09 14:36:16,629 INFO [task-runner-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Registering Eventhandler[crud_historic_toolbars-21-0000-0000]
2014-10-09 14:36:16,630 INFO [task-runner-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Announcing service[DruidNode{serviceName='crud_historic_toolbars-21-0000-0000', host='druid-historic01-dev:8088', port=8088}]
2014-10-09 14:36:17,247 INFO [task-runner-0] io.druid.data.input.FirehoseFactory - Firehose created, will shut down at: 2014-10-09T14:37:00.000Z
2014-10-09 14:36:17,253 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Creating plumber using rejectionPolicy[io.druid.segment.realtime.plumber.NoopRejectionPolicyFactory$1@71cd8a9f]
2014-10-09 14:36:17,256 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Expect to run at [2014-10-09T14:47:00.000Z]
2014-10-09 14:36:17,258 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2014-10-09 14:36:17,258 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]
2014-10-09 14:36:17,258 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2014-10-09 14:37:00,000 INFO [timed-shutoff-firehose-0] io.druid.data.input.FirehoseFactory - Closing delegate firehose.
2014-10-09 14:37:00,001 INFO [timed-shutoff-firehose-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Firehose closing.
2014-10-09 14:37:00,001 INFO [timed-shutoff-firehose-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Unregistering chat handler[druid:local:firehose:crud_historic_toolbars-21-0000-0000]
2014-10-09 14:37:00,001 INFO [timed-shutoff-firehose-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Unannouncing service[DruidNode{serviceName='druid:local:firehose:crud_historic_toolbars-21-0000-0000', host='druid-historic01-dev:8088', port=8088}]
2014-10-09 14:37:00,269 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Submitting persist runnable for dataSource[crud_historic_toolbars]
2014-10-09 14:37:00,270 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Shutting down...
2014-10-09 14:37:00,271 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Removing task directory: /tmp/persistent/task/index_realtime_crud_historic_toolbars_2014-10-09T14:21:00.000Z_0_0_fbflapoc/work
2014-10-09 14:37:00,281 INFO [task-runner-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_realtime_crud_historic_toolbars_2014-10-09T14:21:00.000Z_0_0_fbflapoc",
  "status" : "SUCCESS",
  "duration" : 43750
}


2. Please help me configure tasks so that data is added to Druid continuously, without delays, so I can see every change in the results (every minute)!
 * 2.1 Is it possible to decrease the default value of intermediatePersistPeriod, or is it set to 10 minutes because the overlord can't handle a task shorter than that? I tried to decrease the processing time, but intermediatePersistPeriod[PT10M] should always be equal to or greater than windowPeriod, so do I have to wait a minimum of 20 minutes before the next task?

mar.s...@tuguu.com

Oct 10, 2014, 10:30:43 AM10/10/14
to druid-de...@googlegroups.com
Help! I'm still getting missing events.

Starting from scratch: I checked that Druid is clean, with no tasks present, and then Tranquility gets a new event (successfully passed to Druid):
  {app=mar, toolbar_id=mar, affiliate_id=mar, error_90=1, installs=2, created_at=2014-10-10T15:09:50.521404+01:00, country_id=mar, promo_id=mar, channel_id=mar, traffic_source_id=mar, advertiser_channel=mar}

204562 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Creating new beams for identifier[overlord/crud_historic_toolbars] timestamp[2014-10-10T14:09:00.000Z] (target = 1, actual = 0)
204565 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.control$ - Creating druid indexing task with id: index_realtime_crud_historic_toolbars_2014-10-10T14:09:00.000Z_0_0_pfcmaagj (service = overlord)
204620 [New I/O worker #1] INFO  com.metamx.common.scala.control$ - Created druid indexing task with id: index_realtime_crud_historic_toolbars_2014-10-10T14:09:00.000Z_0_0_pfcmaagj (service = overlord)
204635 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - Updating instances for service[druid:local:firehose:crud_historic_toolbars-09-0000-0000] to Set()
204647 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] WARN  finagle - Name resolution is pending
204650 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.finagle.FinagleRegistry - Created client for service: druid:local:firehose:crud_historic_toolbars-09-0000-0000
204652 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Created beam: {"timestamp":"2014-10-10T14:09:00.000Z","partition":0,"tasks":[{"id":"index_realtime_crud_historic_toolbars_2014-10-10T14:09:00.000Z_0_0_pfcmaagj","firehoseId":"crud_historic_toolbars-09-0000-0000"}]}
204654 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.druid.DruidBeam - Closing Druid beam for datasource[crud_historic_toolbars] timestamp[2014-10-10T14:09:00.000Z] (tasks = index_realtime_crud_historic_toolbars_2014-10-10T14:09:00.000Z_0_0_pfcmaagj)
204655 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.finagle.FinagleRegistry - Closing client for service: druid:local:firehose:crud_historic_toolbars-09-0000-0000
204658 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - No longer monitoring service[druid:local:firehose:crud_historic_toolbars-09-0000-0000]
204687 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Writing new beam data to[/tranquility/beams/overlord/crud_historic_toolbars/data]: {"latestTime":"2014-10-10T14:09:00.000Z","latestCloseTime":"2014-10-10T15:02:00.000+01:00","beams":{"2014-10-10T15:04:00.000+01:00":[{"timestamp":"2014-10-10T14:04:00.000Z","partition":0,"tasks":[{"id":"index_realtime_crud_historic_toolbars_2014-10-10T14:04:00.000Z_0_0_nokafkno","firehoseId":"crud_historic_toolbars-04-0000-0000"}]}],"2014-10-10T14:09:00.000Z":[{"timestamp":"2014-10-10T14:09:00.000Z","partition":0,"tasks":[{"id":"index_realtime_crud_historic_toolbars_2014-10-10T14:09:00.000Z_0_0_pfcmaagj","firehoseId":"crud_historic_toolbars-09-0000-0000"}]}]}}
208217 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Adding beams for identifier[overlord/crud_historic_toolbars] timestamp[2014-10-10T14:09:00.000Z]: List(Map(timestamp -> 2014-10-10T14:09:00.000Z, partition -> 0, tasks -> Vector(Map(id -> index_realtime_crud_historic_toolbars_2014-10-10T14:09:00.000Z_0_0_pfcmaagj, firehoseId -> crud_historic_toolbars-09-0000-0000))))
208431 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - Updating instances for service[druid:local:firehose:crud_historic_toolbars-09-0000-0000] to Set()
208434 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] WARN  finagle - Name resolution is pending
208438 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.finagle.FinagleRegistry - Created client for service: druid:local:firehose:crud_historic_toolbars-09-0000-0000
208444 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Removing beams for identifier[overlord/crud_historic_toolbars] timestamp[2014-10-10T14:04:00.000Z]
208450 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.druid.DruidBeam - Closing Druid beam for datasource[crud_historic_toolbars] timestamp[2014-10-10T15:04:00.000+01:00] (tasks = index_realtime_crud_historic_toolbars_2014-10-10T14:04:00.000Z_0_0_kbodacoo)
208455 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.finagle.FinagleRegistry - Closing client for service: druid:local:firehose:crud_historic_toolbars-04-0000-0000
208579 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - No longer monitoring service[druid:local:firehose:crud_historic_toolbars-04-0000-0000]
209445 [ServiceCache-0] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - Updating instances for service[druid:local:firehose:crud_historic_toolbars-09-0000-0000] to Set(ServiceInstance{name='druid:local:firehose:crud_historic_toolbars-09-0000-0000', id='6ef1d9b2-cec2-4748-89d6-36cb632c7012', address='druid-historic01-dev', port=8089, sslPort=null, payload=null, registrationTimeUTC=1412950199365, serviceType=DYNAMIC, uriSpec=null})
209572 [New I/O worker #3] INFO  com.metamx.tranquility.druid.DruidBeam - Propagated 1 events for 2014-10-10T15:09:00.000+01:00 to firehose[crud_historic_toolbars-09-0000-0000], got response: {"eventCount":1}
209575 [BeamBolt-Emitter-tranquility-0] INFO  com.metamx.tranquility.storm.BeamBolt - Sent 1, ignored 0 queued events.
221578 [BeamBolt-Emitter-tranquility-0] INFO  com.metamx.tranquility.storm.BeamBolt - Sending 1 queued events.


Then I send a new event, unsuccessfully, as you will see:
 {app=mar, toolbar_id=mar, affiliate_id=mar, error_90=1, installs=9, created_at=2014-10-10T15:10:07.298782+01:00, country_id=mar, promo_id=mar, channel_id=mar, traffic_source_id=mar, advertiser_channel=mar}

With task:

Running with task: {
  "type" : "index_realtime",
  "id" : "index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij",
  "resource" : {
    "availabilityGroup" : "crud_historic_toolbars-10-0000",
    "requiredCapacity" : 1
  },
  "spec" : {
    "dataSchema" : {
      "dataSource" : "crud_historic_toolbars",
      "parser" : {
        "type" : "map",
        "parseSpec" : {
          "format" : "json",
          "timestampSpec" : {
            "column" : "created_at",
            "format" : "iso"
          },
          "dimensionsSpec" : {
            "dimensions" : [ "affiliate_id", "country_id", "promo_id", "app", "traffic_source_id", "toolbar_id", "channel_id", "advertiser_channel", "created_at" ],
            "dimensionExclusions" : [ "error_90", "installs" ],
            "spatialDimensions" : [ ]
          }
        }
      },
      "metricsSpec" : [ {
        "type" : "doubleSum",
        "name" : "installs",
        "fieldName" : "installs"
      }, {
        "type" : "doubleSum",
        "name" : "error_90",
        "fieldName" : "error_90"
      } ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "MINUTE",
        "queryGranularity" : {
          "type" : "duration",
          "duration" : 60000,
          "origin" : "1970-01-01T00:00:00.000Z"
        },
        "intervals" : null
      }
    },
    "ioConfig" : {
      "type" : "realtime",
      "firehose" : {
        "type" : "clipped",
        "delegate" : {
          "type" : "timed",
          "delegate" : {
            "type" : "receiver",
            "serviceName" : "druid:local:firehose:crud_historic_toolbars-10-0000-0000",
            "bufferSize" : 100000,
            "parser" : {
              "type" : "map",
              "parseSpec" : {
                "format" : "json",
                "timestampSpec" : {
                  "column" : "created_at",
                  "format" : "iso"
                },
                "dimensionsSpec" : {
                  "dimensions" : [ "affiliate_id", "country_id", "promo_id", "app", "traffic_source_id", "toolbar_id", "channel_id", "advertiser_channel", "created_at" ],
                  "dimensionExclusions" : [ "error_90", "installs" ],
                  "spatialDimensions" : [ ]
                }
              }
            }
          },
          "shutoffTime" : "2014-10-10T14:21:00.000Z"
        },
        "interval" : "2014-10-10T14:10:00.000Z/2014-10-10T14:11:00.000Z"
      }
    },
    "tuningConfig" : {
      "type" : "realtime",
      "maxRowsInMemory" : 75000,
      "intermediatePersistPeriod" : "PT5M",
      "windowPeriod" : "PT5M",
      "basePersistDirectory" : "/tmp/1412950051496-0",
      "versioningPolicy" : {
        "type" : "intervalStart"
      },
      "maxPendingPersists" : 1,
      "shardSpec" : {
        "type" : "linear",
        "partitionNum" : 0
      },
      "rejectionPolicyFactory" : {
        "type" : "none"
      }
    }
  },
  "groupId" : "index_realtime_crud_historic_toolbars",
  "dataSource" : "crud_historic_toolbars"
}


221645 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Creating new beams for identifier[overlord/crud_historic_toolbars] timestamp[2014-10-10T14:10:00.000Z] (target = 1, actual = 0)
221652 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.control$ - Creating druid indexing task with id: index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij (service = overlord)
221697 [New I/O worker #1] INFO  com.metamx.common.scala.control$ - Created druid indexing task with id: index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij (service = overlord)
221719 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - Updating instances for service[druid:local:firehose:crud_historic_toolbars-10-0000-0000] to Set()
221721 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] WARN  finagle - Name resolution is pending
221726 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.finagle.FinagleRegistry - Created client for service: druid:local:firehose:crud_historic_toolbars-10-0000-0000
221733 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Created beam: {"timestamp":"2014-10-10T14:10:00.000Z","partition":0,"tasks":[{"id":"index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij","firehoseId":"crud_historic_toolbars-10-0000-0000"}]}
221736 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.druid.DruidBeam - Closing Druid beam for datasource[crud_historic_toolbars] timestamp[2014-10-10T14:10:00.000Z] (tasks = index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij)
221737 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.finagle.FinagleRegistry - Closing client for service: druid:local:firehose:crud_historic_toolbars-10-0000-0000
221739 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - No longer monitoring service[druid:local:firehose:crud_historic_toolbars-10-0000-0000]
221748 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Writing new beam data to[/tranquility/beams/overlord/crud_historic_toolbars/data]: {"latestTime":"2014-10-10T14:10:00.000Z","latestCloseTime":"2014-10-10T15:04:00.000+01:00","beams":{"2014-10-10T15:09:00.000+01:00":[{"timestamp":"2014-10-10T14:09:00.000Z","partition":0,"tasks":[{"id":"index_realtime_crud_historic_toolbars_2014-10-10T14:09:00.000Z_0_0_pfcmaagj","firehoseId":"crud_historic_toolbars-09-0000-0000"}]}],"2014-10-10T14:10:00.000Z":[{"timestamp":"2014-10-10T14:10:00.000Z","partition":0,"tasks":[{"id":"index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij","firehoseId":"crud_historic_toolbars-10-0000-0000"}]}]}}
221787 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Adding beams for identifier[overlord/crud_historic_toolbars] timestamp[2014-10-10T14:10:00.000Z]: List(Map(timestamp -> 2014-10-10T14:10:00.000Z, partition -> 0, tasks -> Vector(Map(id -> index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij, firehoseId -> crud_historic_toolbars-10-0000-0000))))
221823 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - Updating instances for service[druid:local:firehose:crud_historic_toolbars-10-0000-0000] to Set()
221828 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] WARN  finagle - Name resolution is pending
221841 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.finagle.FinagleRegistry - Created client for service: druid:local:firehose:crud_historic_toolbars-10-0000-0000
221848 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.beam.ClusteredBeam - Removing beams for identifier[overlord/crud_historic_toolbars] timestamp[2014-10-10T14:09:00.000Z]
221853 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.druid.DruidBeam - Closing Druid beam for datasource[crud_historic_toolbars] timestamp[2014-10-10T15:09:00.000+01:00] (tasks = index_realtime_crud_historic_toolbars_2014-10-10T14:09:00.000Z_0_0_pfcmaagj)
221854 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.tranquility.finagle.FinagleRegistry - Closing client for service: druid:local:firehose:crud_historic_toolbars-09-0000-0000
221862 [ClusteredBeam-ZkFuturePool-ab400c2e-95ef-44c8-b70b-a91dc748e920] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - No longer monitoring service[druid:local:firehose:crud_historic_toolbars-09-0000-0000]
228365 [ServiceCache-0] INFO  com.metamx.common.scala.net.finagle.DiscoResolver - Updating instances for service[druid:local:firehose:crud_historic_toolbars-10-0000-0000] to Set(ServiceInstance{name='druid:local:firehose:crud_historic_toolbars-10-0000-0000', id='0c9fade3-987b-4d15-a039-841005d47734', address='druid-historic01-dev', port=8090, sslPort=null, payload=null, registrationTimeUTC=1412950215554, serviceType=DYNAMIC, uriSpec=null})
228454 [New I/O worker #1] WARN  com.metamx.tranquility.finagle.FutureRetry$ - Transient error, will try again in 129 ms
java.io.IOException: Unable to push events to task: index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij (status = TaskRunning)
    at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$3$$anonfun$applyOrElse$2.apply(DruidBeam.scala:149) ~[tranquility_2.10-0.2.16.jar:0.2.16]
    at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$3$$anonfun$applyOrElse$2.apply(DruidBeam.scala:135) ~[tranquility_2.10-0.2.16.jar:0.2.16]
    at com.twitter.util.Future$$anonfun$map$1$$anonfun$apply$4.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Try$.apply(Try.scala:13) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$.apply(Future.scala:82) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:784) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:783) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$Transformer.liftedTree1$1(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$Transformer.k(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$Transformer.apply(Promise.scala:102) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$Transformer.apply(Promise.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$$anon$2.run(Promise.scala:324) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.LocalScheduler$Activation.run(Scheduler.scala:184) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.LocalScheduler$Activation.submit(Scheduler.scala:155) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.LocalScheduler.submit(Scheduler.scala:210) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.Scheduler$.submit(Scheduler.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise.runq(Promise.scala:310) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise.updateIfEmpty(Promise.scala:605) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise.update(Promise.scala:583) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise.setValue(Promise.scala:559) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.AsyncQueue.offer(AsyncQueue.scala:76) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.finagle.transport.ChannelTransport.handleUpstream(ChannelTransport.scala:45) [finagle-core_2.10-6.18.0.jar:6.18.0]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.handler.codec.http.HttpContentDecoder.messageReceived(HttpContentDecoder.java:108) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:194) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:459) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:536) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.handler.codec.http.HttpClientCodec.handleUpstream(HttpClientCodec.java:92) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142) [netty-3.8.0.Final.jar:na]
    at com.twitter.finagle.channel.ChannelStatsHandler.messageReceived(ChannelStatsHandler.scala:81) [finagle-core_2.10-6.18.0.jar:6.18.0]
    at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142) [netty-3.8.0.Final.jar:na]
    at com.twitter.finagle.channel.ChannelRequestStatsHandler.messageReceived(ChannelRequestStatsHandler.scala:35) [finagle-core_2.10-6.18.0.jar:6.18.0]
    at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [netty-3.8.0.Final.jar:na]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_65]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_65]
    at java.lang.Thread.run(Thread.java:745) [na:1.7.0_65]
Caused by: java.io.IOException: Failed to propagate 1 events for 2014-10-10T15:10:00.000+01:00: 302 Found
    at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$8.apply(DruidBeam.scala:123) ~[tranquility_2.10-0.2.16.jar:0.2.16]
    at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$8.apply(DruidBeam.scala:110) ~[tranquility_2.10-0.2.16.jar:0.2.16]
    at com.twitter.util.Future$$anonfun$map$1$$anonfun$apply$4.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Try$.apply(Try.scala:13) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$.apply(Future.scala:82) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:784) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:783) ~[util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$Transformer.liftedTree1$1(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$Transformer.k(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$Transformer.apply(Promise.scala:102) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$Transformer.apply(Promise.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise$$anon$2.run(Promise.scala:324) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.LocalScheduler$Activation.run(Scheduler.scala:184) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.LocalScheduler$Activation.submit(Scheduler.scala:155) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.LocalScheduler.submit(Scheduler.scala:210) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.Scheduler$.submit(Scheduler.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise.runq(Promise.scala:310) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise.updateIfEmpty(Promise.scala:605) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise.update(Promise.scala:583) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.util.Promise.setValue(Promise.scala:559) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.concurrent.AsyncQueue.offer(AsyncQueue.scala:76) [util-core_2.10-6.18.0.jar:6.18.0]
    at com.twitter.finagle.transport.ChannelTransport.handleUpstream(ChannelTransport.scala:45) [finagle-core_2.10-6.18.0.jar:6.18.0]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.handler.codec.http.HttpContentDecoder.messageReceived(HttpContentDecoder.java:108) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
    at org.jboss.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:145) [netty-3.8.0.Final.jar:na]
    ... 33 common frames omitted
228615 [New I/O worker #1] WARN  com.metamx.tranquility.finagle.FutureRetry$ - Transient error, will try again in 274 ms
java.io.IOException: Unable to push events to task: index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij (status = TaskRunning)
 at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$3$$anonfun$applyOrElse$2.apply(DruidBeam.scala:149) ~[tranquility_2.10-0.2.16.jar:0.2.16]
 at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$3$$anonfun$applyOrElse$2.apply(DruidBeam.scala:135) ~[tranquility_2.10-0.2.16.jar:0.2.16]
 at com.twitter.util.Future$$anonfun$map$1$$anonfun$apply$4.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Try$.apply(Try.scala:13) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$.apply(Future.scala:82) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:784) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:783) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.liftedTree1$1(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.k(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.apply(Promise.scala:102) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.apply(Promise.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$$anon$2.run(Promise.scala:324) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.LocalScheduler$Activation.run(Scheduler.scala:184) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.LocalScheduler$Activation.submit(Scheduler.scala:155) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.LocalScheduler.submit(Scheduler.scala:210) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.Scheduler$.submit(Scheduler.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.runq(Promise.scala:310) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.updateIfEmpty(Promise.scala:605) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.update(Promise.scala:583) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.setValue(Promise.scala:559) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.AsyncQueue.offer(AsyncQueue.scala:76) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.finagle.transport.ChannelTransport.handleUpstream(ChannelTransport.scala:45) [finagle-core_2.10-6.18.0.jar:6.18.0]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.http.HttpContentDecoder.messageReceived(HttpContentDecoder.java:108) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:194) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:459) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:536) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.http.HttpClientCodec.handleUpstream(HttpClientCodec.java:92) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142) [netty-3.8.0.Final.jar:na]
 at com.twitter.finagle.channel.ChannelStatsHandler.messageReceived(ChannelStatsHandler.scala:81) [finagle-core_2.10-6.18.0.jar:6.18.0]
 at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142) [netty-3.8.0.Final.jar:na]
 at com.twitter.finagle.channel.ChannelRequestStatsHandler.messageReceived(ChannelRequestStatsHandler.scala:35) [finagle-core_2.10-6.18.0.jar:6.18.0]
 at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [netty-3.8.0.Final.jar:na]
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_65]
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_65]
 at java.lang.Thread.run(Thread.java:745) [na:1.7.0_65]
Caused by: java.io.IOException: Failed to propagate 1 events for 2014-10-10T15:10:00.000+01:00: 302 Found
 at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$8.apply(DruidBeam.scala:123) ~[tranquility_2.10-0.2.16.jar:0.2.16]
 at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$8.apply(DruidBeam.scala:110) ~[tranquility_2.10-0.2.16.jar:0.2.16]
 at com.twitter.util.Future$$anonfun$map$1$$anonfun$apply$4.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Try$.apply(Try.scala:13) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$.apply(Future.scala:82) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:784) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:783) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.liftedTree1$1(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.k(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.apply(Promise.scala:102) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.apply(Promise.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$$anon$2.run(Promise.scala:324) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.LocalScheduler$Activation.run(Scheduler.scala:184) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.LocalScheduler$Activation.submit(Scheduler.scala:155) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.LocalScheduler.submit(Scheduler.scala:210) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.Scheduler$.submit(Scheduler.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.runq(Promise.scala:310) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.updateIfEmpty(Promise.scala:605) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.update(Promise.scala:583) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.setValue(Promise.scala:559) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.AsyncQueue.offer(AsyncQueue.scala:76) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.finagle.transport.ChannelTransport.handleUpstream(ChannelTransport.scala:45) [finagle-core_2.10-6.18.0.jar:6.18.0]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.http.HttpContentDecoder.messageReceived(HttpContentDecoder.java:108) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:145) [netty-3.8.0.Final.jar:na]
 ... 33 common frames omitted
284888 [New I/O worker #4] WARN  com.metamx.tranquility.beam.ClusteredBeam - Emitting alert: [anomaly] Failed to propagate events: overlord/crud_historic_toolbars
{
  "eventCount" : 1,
  "timestamp" : "2014-10-10T14:10:00.000Z",
  "beams" : "HashPartitionBeam(DruidBeam(timestamp = 2014-10-10T15:10:00.000+01:00, partition = 0, tasks = [index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij/crud_historic_toolbars-10-0000-0000]))"
}
java.io.IOException: Failed to propagate 1 events for 2014-10-10T15:10:00.000+01:00: 302 Found
 at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$8.apply(DruidBeam.scala:123) ~[tranquility_2.10-0.2.16.jar:0.2.16]
 at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$8.apply(DruidBeam.scala:110) ~[tranquility_2.10-0.2.16.jar:0.2.16]
 at com.twitter.util.Future$$anonfun$map$1$$anonfun$apply$4.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Try$.apply(Try.scala:13) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$.apply(Future.scala:82) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:784) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:783) ~[util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.liftedTree1$1(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.k(Promise.scala:93) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.apply(Promise.scala:102) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$Transformer.apply(Promise.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise$$anon$2.run(Promise.scala:324) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.LocalScheduler$Activation.run(Scheduler.scala:184) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.LocalScheduler$Activation.submit(Scheduler.scala:155) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.LocalScheduler.submit(Scheduler.scala:210) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.Scheduler$.submit(Scheduler.scala:84) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.runq(Promise.scala:310) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.updateIfEmpty(Promise.scala:605) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.update(Promise.scala:583) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.util.Promise.setValue(Promise.scala:559) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.concurrent.AsyncQueue.offer(AsyncQueue.scala:76) [util-core_2.10-6.18.0.jar:6.18.0]
 at com.twitter.finagle.transport.ChannelTransport.handleUpstream(ChannelTransport.scala:45) [finagle-core_2.10-6.18.0.jar:6.18.0]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.http.HttpContentDecoder.messageReceived(HttpContentDecoder.java:108) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:145) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:459) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:536) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.handler.codec.http.HttpClientCodec.handleUpstream(HttpClientCodec.java:92) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142) [netty-3.8.0.Final.jar:na]
 at com.twitter.finagle.channel.ChannelStatsHandler.messageReceived(ChannelStatsHandler.scala:81) [finagle-core_2.10-6.18.0.jar:6.18.0]
 at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142) [netty-3.8.0.Final.jar:na]
 at com.twitter.finagle.channel.ChannelRequestStatsHandler.messageReceived(ChannelRequestStatsHandler.scala:35) [finagle-core_2.10-6.18.0.jar:6.18.0]
 at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [netty-3.8.0.Final.jar:na]
 at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [netty-3.8.0.Final.jar:na]
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_65]
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_65]
 at java.lang.Thread.run(Thread.java:745) [na:1.7.0_65]
284996 [New I/O worker #4] INFO  com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"alerts","timestamp":"2014-10-10T15:11:14.695+01:00","service":"tranquility","host":"localhost","severity":"anomaly","description":"Failed to propagate events: overlord/crud_historic_toolbars","data":{"exceptionType":"java.io.IOException","exceptionStackTrace":"java.io.IOException: Failed to propagate 1 events for 2014-10-10T15:10:00.000+01:00: 302 Found\n\tat com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$8.apply(DruidBeam.scala:123)\n\tat com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$8.apply(DruidBeam.scala:110)\n\tat com.twitter.util.Future$$anonfun$map$1$$anonfun$apply$4.apply(Future.scala:821)\n\tat com.twitter.util.Try$.apply(Try.scala:13)\n\tat com.twitter.util.Future$.apply(Future.scala:82)\n\tat com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821)\n\tat com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:821)\n\tat com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:784)\n\tat com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:783)\n\tat com.twitter.util.Promise$Transformer.liftedTree1$1(Promise.scala:93)\n\tat com.twitter.util.Promise$Transformer.k(Promise.scala:93)\n\tat com.twitter.util.Promise$Transformer.apply(Promise.scala:102)\n\tat com.twitter.util.Promise$Transformer.apply(Promise.scala:84)\n\tat com.twitter.util.Promise$$anon$2.run(Promise.scala:324)\n\tat com.twitter.concurrent.LocalScheduler$Activation.run(Scheduler.scala:184)\n\tat com.twitter.concurrent.LocalScheduler$Activation.submit(Scheduler.scala:155)\n\tat com.twitter.concurrent.LocalScheduler.submit(Scheduler.scala:210)\n\tat com.twitter.concurrent.Scheduler$.submit(Scheduler.scala:84)\n\tat com.twitter.util.Promise.runq(Promise.scala:310)\n\tat com.twitter.util.Promise.updateIfEmpty(Promise.scala:605)\n\tat com.twitter.util.Promise.update(Promise.scala:583)\n\tat com.twitter.util.Promise.setValue(Promise.scala:559)\n\tat com.twitter.concurrent.AsyncQueue.offer(AsyncQueue.scala:76)\n\tat com.twitter.finagle.transport.ChannelTransport.handleUpstream(ChannelTransport.scala:45)\n\tat org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)\n\tat org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)\n\tat org.jboss.netty.handler.codec.http.HttpContentDecoder.messageReceived(HttpContentDecoder.java:108)\n\tat org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)\n\tat org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)\n\tat org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)\n\tat org.jboss.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:145)\n\tat org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)\n\tat org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)\n\tat org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)\n\tat org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)\n\tat 
org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:459)\n\tat org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:536)\n\tat org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435)\n\tat org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)\n\tat org.jboss.netty.handler.codec.http.HttpClientCodec.handleUpstream(HttpClientCodec.java:92)\n\tat org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)\n\tat org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)\n\tat org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142)\n\tat com.twitter.finagle.channel.ChannelStatsHandler.messageReceived(ChannelStatsHandler.scala:81)\n\tat org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88)\n\tat org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)\n\tat org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)\n\tat org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142)\n\tat com.twitter.finagle.channel.ChannelRequestStatsHandler.messageReceived(ChannelRequestStatsHandler.scala:35)\n\tat org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88)\n\tat org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)\n\tat org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)\n\tat org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)\n\tat org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)\n\tat org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)\n\tat org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)\n\tat org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)\n\tat org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)\n\tat org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)\n\tat org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)\n\tat org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)\n\tat java.lang.Thread.run(Thread.java:745)\n","timestamp":"2014-10-10T14:10:00.000Z","beams":"HashPartitionBeam(DruidBeam(timestamp = 2014-10-10T15:10:00.000+01:00, partition = 0, tasks = [index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij/crud_historic_toolbars-10-0000-0000]))","eventCount":1,"exceptionMessage":"Failed to propagate 1 events for 2014-10-10T15:10:00.000+01:00: 302 Found"}}]
284997 [BeamBolt-Emitter-tranquility-0] INFO  com.metamx.tranquility.storm.BeamBolt - Sent 0, ignored 1 queued events.
293960 [Thread-10-kinesis_spout] INFO  com.amazonaws.services.kinesis.stormspout.stormspout.state.zookeeper.ZookeeperStateManager - ZookeeperStateManager[taskIndex=0]Advanced checkpoint for shardId-000000000000 to 49542843645697196133996540787206229642423409602577039361



When that task finished, I sent a new event, and it made it into Druid successfully.
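
One way to see where that 302 is pointing (a minimal diagnostic sketch, not from the original thread): post a single event to the task's event-receiver endpoint with redirect-following disabled and print the Location header. The host, port, and firehose id below are placeholders modeled on the task names in the logs above, and the /druid/worker/v1/chat/.../push-events path is assumed from the event-receiver firehose convention, so check it against your Druid version:

import java.net.{HttpURLConnection, URL}

// Hedged sketch: push one JSON event with redirects disabled so the 302's
// Location header stays visible. PEON_HOST, the port and the firehose id
// are placeholders -- substitute the values from your own running task.
object PushProbe {
  def main(args: Array[String]): Unit = {
    val url = new URL("http://PEON_HOST:8100/druid/worker/v1/chat/" +
      "crud_historic_toolbars-10-0000-0000/push-events")
    val conn = url.openConnection().asInstanceOf[HttpURLConnection]
    conn.setInstanceFollowRedirects(false)   // do not follow the 302
    conn.setRequestMethod("POST")
    conn.setRequestProperty("Content-Type", "application/json")
    conn.setDoOutput(true)
    conn.getOutputStream.write(
      """[{"timestamp":"2014-10-10T15:10:00.000+01:00","value":1}]""".getBytes("UTF-8"))
    println("status   = " + conn.getResponseCode)
    println("location = " + conn.getHeaderField("Location"))
    conn.disconnect()
  }
}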

mar.s...@tuguu.com

unread,
Oct 10, 2014, 11:40:44 AM10/10/14
to druid-de...@googlegroups.com

To clarify my question, here are some additional configs:

jvm args:
/usr/bin/java -Xmx2g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -classpath /opt/druid/druid-services/lib/*:/opt/druid/etc/overlord io.druid.cli.Main server overlord

/opt/druid/etc/overlord/runtime.properties
druid.indexer.task.storage.type=db
druid.indexer.task.chathandler.type=announce
druid.worker.capacity=20
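
For what it's worth, since druid.indexer.task.chathandler.type=announce makes the peon announce its event-receiver endpoint through service discovery, it can also help to confirm that the overlord actually reports the realtime task as running while Tranquility is pushing. A minimal sketch, assuming the standard /druid/indexer/v1/runningTasks overlord endpoint; OVERLORD_HOST:8090 is a placeholder for your overlord address:

import scala.io.Source

// Hedged sketch: ask the overlord which tasks it currently thinks are running.
// OVERLORD_HOST:8090 is a placeholder -- use your own overlord host and port.
object RunningTasks {
  def main(args: Array[String]): Unit = {
    val json = Source
      .fromURL("http://OVERLORD_HOST:8090/druid/indexer/v1/runningTasks")
      .mkString
    println(json) // look for the index_realtime_* task id Tranquility is targeting
  }
}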


Gian Merlino

unread,
Oct 12, 2014, 1:42:40 PM10/12/14
to druid-de...@googlegroups.com
Hmm, the "302 Found" response from the task is strange. Do you have logs from the task "index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij"? They might be interesting.
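
In case it helps to pull the log Gian is asking about, here is a minimal sketch that fetches it through the overlord. The /druid/indexer/v1/task/{taskId}/log path is assumed from the indexing-service API, and OVERLORD_HOST:8090 is a placeholder:

import java.net.URL
import scala.io.Source

// Hedged sketch: download a task log via the overlord's task-log endpoint.
object FetchTaskLog {
  def main(args: Array[String]): Unit = {
    val taskId = "index_realtime_crud_historic_toolbars_2014-10-10T14:10:00.000Z_0_0_lhnnedij"
    val url = "http://OVERLORD_HOST:8090/druid/indexer/v1/task/" + taskId + "/log"
    println(Source.fromURL(new URL(url)).mkString)
  }
}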

mar.s...@tuguu.com

unread,
Oct 16, 2014, 7:02:02 AM10/16/14
to druid-de...@googlegroups.com
Hi Gian!

I couldn't find the task log you asked for, so I reproduced the error; here is the current task log, showing the same failure:

Failed to propagate 189 events for 2014-10-16T11:05:00.000+01:00: 302 Found

7876143 [SyncThread:0] WARN org.apache.zookeeper.server.persistence.FileTxnLog - fsync-ing the write ahead log in SyncThread:0 took 3530ms which will adversely effect operation latency. See the ZooKeeper troubleshooting guide
7876285 [New I/O worker #3] WARN com.metamx.tranquility.finagle.FutureRetry$ - Transient error, will try again in 30406 ms
java.io.IOException: Unable to push events to task: index_realtime_crud_historic_toolbars_2014-10-16T10:05:00.000Z_0_0_ehglngel (status = TaskRunning)
Caused by: java.io.IOException: Failed to propagate 189 events for 2014-10-16T11:05:00.000+01:00: 302 Found
index_realtime_crud_historic_toolbars_2014-10-16T10:05:00.000Z_0_0_ehglngel
2014-10-16 10:06:01,224 INFO [main] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2014-10-16 10:06:01,251 INFO [main] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.0.1.Final
2014-10-16 10:06:01,690 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, coordinates=[io.druid.extensions:druid-s3-extensions:0.6.146], localRepository='/root/.m2/repository', remoteRepositories=[http://repo1.maven.org/maven2/, https://metamx.artifactoryonline.com/metamx/pub-libs-releases-local]}]
2014-10-16 10:06:01,800 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-s3-extensions:0.6.146] for class[io.druid.cli.CliCommandCreator]
2014-10-16 10:06:02,298 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/io/druid/extensions/druid-s3-extensions/0.6.146/druid-s3-extensions-0.6.146.jar]
2014-10-16 10:06:02,298 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/net/java/dev/jets3t/jets3t/0.9.1/jets3t-0.9.1.jar]
2014-10-16 10:06:02,298 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/commons-codec/commons-codec/1.7/commons-codec-1.7.jar]
2014-10-16 10:06:02,298 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar]
2014-10-16 10:06:02,298 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar]
2014-10-16 10:06:02,298 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/javax/activation/activation/1.1.1/activation-1.1.1.jar]
2014-10-16 10:06:02,298 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/mx4j/mx4j/3.0.2/mx4j-3.0.2.jar]
2014-10-16 10:06:02,298 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/javax/mail/mail/1.4.7/mail-1.4.7.jar]
2014-10-16 10:06:02,298 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar]
2014-10-16 10:06:02,299 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/bouncycastle/bcprov-jdk15/1.46/bcprov-jdk15-1.46.jar]
2014-10-16 10:06:02,299 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar]
2014-10-16 10:06:02,299 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/amazonaws/aws-java-sdk/1.6.0.1/aws-java-sdk-1.6.0.1.jar]
2014-10-16 10:06:02,299 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.2.3/jackson-core-2.2.3.jar]
2014-10-16 10:06:02,302 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.2.3/jackson-annotations-2.2.3.jar]
2014-10-16 10:06:02,302 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/apache/httpcomponents/httpclient/4.2/httpclient-4.2.jar]
2014-10-16 10:06:02,302 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/org/apache/httpcomponents/httpcore/4.2/httpcore-4.2.jar]
2014-10-16 10:06:02,302 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/metamx/emitter/0.2.11/emitter-0.2.11.jar]
2014-10-16 10:06:02,302 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/metamx/http-client/0.9.6/http-client-0.9.6.jar]
2014-10-16 10:06:02,303 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/io/netty/netty/3.9.0.Final/netty-3.9.0.Final.jar]
2014-10-16 10:06:02,303 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/log4j/log4j/1.2.16/log4j-1.2.16.jar]
2014-10-16 10:06:02,303 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/com/google/guava/guava/16.0.1/guava-16.0.1.jar]
2014-10-16 10:06:02,303 INFO [main] io.druid.initialization.Initialization - Added URL[file:/root/.m2/repository/commons-io/commons-io/2.0.1/commons-io-2.0.1.jar]
2014-10-16 10:06:02,479 INFO [main] io.druid.initialization.Initialization - Loading extension[io.druid.extensions:druid-s3-extensions:0.6.146] for class[io.druid.initialization.DruidModule]
2014-10-16 10:06:02,583 INFO [main] io.druid.initialization.Initialization - Adding extension module[class io.druid.storage.s3.S3StorageDruidModule] for class[io.druid.initialization.DruidModule]
2014-10-16 10:06:02,583 INFO [main] io.druid.initialization.Initialization - Adding extension module[class io.druid.firehose.s3.S3FirehoseDruidModule] for class[io.druid.initialization.DruidModule]
2014-10-16 10:06:03,000 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class com.metamx.emitter.core.LoggingEmitterConfig] from props[druid.emitter.logging.] as [LoggingEmitterConfig{loggerClass='com.metamx.emitter.core.LoggingEmitter', logLevel='info'}]
2014-10-16 10:06:03,060 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.DruidMonitorSchedulerConfig] from props[druid.monitoring.] as [io.druid.server.metrics.DruidMonitorSchedulerConfig@13fea274]
2014-10-16 10:06:03,074 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.MonitorsConfig] from props[druid.monitoring.] as [MonitorsConfig{monitors=[]}]
2014-10-16 10:06:03,091 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.DruidNode] from props[druid.] as [DruidNode{serviceName='overlord', host='druid-historic01-dev:8090', port=8090}]
2014-10-16 10:06:03,147 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ServerConfig] from props[druid.server.http.] as [io.druid.server.initialization.ServerConfig@25c97c75]
2014-10-16 10:06:03,157 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.config.TaskConfig] from props[druid.indexer.task.] as [io.druid.indexing.common.config.TaskConfig@237098e7]
2014-10-16 10:06:03,162 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.http.DruidHttpClientConfig] from props[druid.global.http.] as [io.druid.guice.http.DruidHttpClientConfig@4014308]
2014-10-16 10:06:03,254 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.indexing.IndexingServiceSelectorConfig] from props[druid.selectors.indexing.] as [io.druid.client.indexing.IndexingServiceSelectorConfig@6398523f]
2014-10-16 10:06:03,256 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [kafka01-dev:2181] for [druid.zk.service.host] on [io.druid.curator.CuratorConfig#getZkHosts()]
2014-10-16 10:06:03,258 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [30000] for [druid.zk.service.sessionTimeoutMs] on [io.druid.curator.CuratorConfig#getZkSessionTimeoutMs()]
2014-10-16 10:06:03,258 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [false] for [druid.curator.compress] on [io.druid.curator.CuratorConfig#enableCompression()]
2014-10-16 10:06:03,315 WARN [main] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
2014-10-16 10:06:03,335 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.CuratorDiscoveryConfig] from props[druid.discovery.curator.] as [io.druid.server.initialization.CuratorDiscoveryConfig@c50e3bf]
2014-10-16 10:06:03,450 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.RetryPolicyConfig] from props[druid.peon.taskActionClient.retry.] as [io.druid.indexing.common.RetryPolicyConfig@70307b0e]
2014-10-16 10:06:03,454 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.storage.s3.AWSCredentialsConfig] from props[druid.s3.] as [io.druid.storage.s3.AWSCredentialsConfig@7775969d]
2014-10-16 10:06:03,537 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.storage.s3.S3DataSegmentPusherConfig] from props[druid.storage.] as [io.druid.storage.s3.S3DataSegmentPusherConfig@3f412250]
2014-10-16 10:06:03,540 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.storage.s3.S3DataSegmentArchiverConfig] from props[druid.storage.] as [io.druid.storage.s3.S3DataSegmentArchiverConfig@c79b44d]
2014-10-16 10:06:03,553 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.DruidServerConfig] from props[druid.server.] as [io.druid.client.DruidServerConfig@6219992e]
2014-10-16 10:06:03,554 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.base] on [io.druid.server.initialization.ZkPathsConfig#getZkBasePath()]
2014-10-16 10:06:03,554 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.liveSegmentsPath] on [io.druid.server.initialization.ZkPathsConfig#getLiveSegmentsPath()]
2014-10-16 10:06:03,554 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.announcementsPath] on [io.druid.server.initialization.ZkPathsConfig#getAnnouncementsPath()]
2014-10-16 10:06:03,554 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.servedSegmentsPath] on [io.druid.server.initialization.ZkPathsConfig#getServedSegmentsPath()]
2014-10-16 10:06:03,554 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.loadQueuePath] on [io.druid.server.initialization.ZkPathsConfig#getLoadQueuePath()]
2014-10-16 10:06:03,554 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.coordinatorPath] on [io.druid.server.initialization.ZkPathsConfig#getCoordinatorPath()]
2014-10-16 10:06:03,554 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.connectorPath] on [io.druid.server.initialization.ZkPathsConfig#getConnectorPath()]
2014-10-16 10:06:03,555 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.propertiesPath] on [io.druid.server.initialization.ZkPathsConfig#getPropertiesPath()]
2014-10-16 10:06:03,555 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.announcementsPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerAnnouncementPath()]
2014-10-16 10:06:03,555 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.tasksPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerTaskPath()]
2014-10-16 10:06:03,555 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.statusPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerStatusPath()]
2014-10-16 10:06:03,555 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.zk.paths.indexer.leaderLatchPath] on [io.druid.server.initialization.ZkPathsConfig#getIndexerLeaderLatchPath()]
2014-10-16 10:06:03,580 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig@3c437143]
2014-10-16 10:06:03,584 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.coordination.DataSegmentAnnouncerProvider] from props[druid.announcer.] as [io.druid.server.coordination.LegacyDataSegmentAnnouncerProvider@399b2815]
2014-10-16 10:06:03,589 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.client.FilteredServerViewProvider] from props[druid.announcer.] as [io.druid.client.SingleServerInventoryProvider@69be39e6]
2014-10-16 10:06:03,597 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.QueryConfig] from props[druid.query.] as [io.druid.query.QueryConfig@6a39e61b]
2014-10-16 10:06:03,607 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.search.search.SearchQueryConfig] from props[druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig@23efdada]
2014-10-16 10:06:03,613 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig@2d986369]
2014-10-16 10:06:03,614 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [268435456] for [druid.computation.buffer.size] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2014-10-16 10:06:03,614 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2014-10-16 10:06:03,614 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2014-10-16 10:06:03,614 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2014-10-16 10:06:03,629 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig@59c6b2e3]
2014-10-16 10:06:03,636 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.log.RequestLoggerProvider] from props[druid.request.logging.] as [io.druid.server.log.NoopRequestLoggerProvider@5972a80d]
2014-10-16 10:06:03,641 INFO [main] org.eclipse.jetty.util.log - Logging initialized @2792ms
2014-10-16 10:06:03,692 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.core.LoggingEmitter.start()] on object[com.metamx.emitter.core.LoggingEmitter@26467d78].
2014-10-16 10:06:03,692 INFO [main] com.metamx.emitter.core.LoggingEmitter - Start: started [true]
2014-10-16 10:06:03,692 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.service.ServiceEmitter.start()] on object[com.metamx.emitter.service.ServiceEmitter@1bc0c549].
2014-10-16 10:06:03,692 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.metrics.MonitorScheduler.start()] on object[com.metamx.metrics.MonitorScheduler@cd849db].
2014-10-16 10:06:03,694 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.http.client.HttpClient.start()] on object[com.metamx.http.client.HttpClient@3a71309f].
2014-10-16 10:06:03,694 INFO [main] io.druid.curator.CuratorModule - Starting Curator
2014-10-16 10:06:03,694 INFO [main] org.apache.curator.framework.imps.CuratorFrameworkImpl - Starting
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:host.name=localhost
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.7.0_51
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.home=/usr/lib/jvm/java-7-openjdk-amd64/jre
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=/opt/druid/druid-services/lib/antlr4-runtime-4.0.jar:/opt/druid/druid-services/lib/alphanum-1.0.3.jar:/opt/druid/druid-services/lib/curator-framework-2.6.0.jar:/opt/druid/druid-services/lib/maven-aether-provider-3.1.1.jar:/opt/druid/druid-services/lib/jersey-server-1.17.1.jar:/opt/druid/druid-services/lib/maven-repository-metadata-3.1.1.jar:/opt/druid/druid-services/lib/druid-indexing-hadoop-0.6.146.jar:/opt/druid/druid-services/lib/jetty-proxy-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/jetty-security-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/jackson-datatype-joda-2.2.3.jar:/opt/druid/druid-services/lib/slf4j-log4j12-1.6.2.jar:/opt/druid/druid-services/lib/compress-lzf-0.8.4.jar:/opt/druid/druid-services/lib/server-metrics-0.0.9.jar:/opt/druid/druid-services/lib/http-client-0.9.6.jar:/opt/druid/druid-services/lib/mail-1.4.7.jar:/opt/druid/druid-services/lib/maven-model-3.1.1.jar:/opt/druid/druid-services/lib/aether-connector-file-0.9.0.M2.jar:/opt/druid/druid-services/lib/spymemcached-2.8.4.jar:/opt/druid/druid-services/lib/jetty-util-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/javax.inject-1.jar:/opt/druid/druid-services/lib/maxminddb-0.2.0.jar:/opt/druid/druid-services/lib/jackson-module-jaxb-annotations-2.2.3.jar:/opt/druid/druid-services/lib/javax.servlet-api-3.1.0.jar:/opt/druid/druid-services/lib/mysql-connector-java-5.1.18.jar:/opt/druid/druid-services/lib/jetty-servlets-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/javax.el-3.0.0.jar:/opt/druid/druid-services/lib/guice-multibindings-4.0-beta.jar:/opt/druid/druid-services/lib/jetty-server-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/joda-time-2.1.jar:/opt/druid/druid-services/lib/httpclient-4.2.jar:/opt/druid/druid-services/lib/curator-client-2.6.0.jar:/opt/druid/druid-services/lib/okhttp-1.0.2.jar:/opt/druid/druid-services/lib/jsr305-2.0.1.jar:/opt/druid/druid-services/lib/slf4j-api-1.6.4.jar:/opt/druid/druid-services/lib/druid-services-0.6.146.jar:/opt/druid/druid-services/lib/jackson-jaxrs-json-provider-2.2.3.jar:/opt/druid/druid-services/lib/jackson-core-asl-1.9.13.jar:/opt/druid/druid-services/lib/jboss-logging-3.1.1.GA.jar:/opt/druid/druid-services/lib/aether-impl-0.9.0.M2.jar:/opt/druid/druid-services/lib/rhino-1.7R4.jar:/opt/druid/druid-services/lib/netty-3.9.0.Final.jar:/opt/druid/druid-services/lib/hibernate-validator-5.0.1.Final.jar:/opt/druid/druid-services/lib/jetty-servlet-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/aws-java-sdk-1.6.0.1.jar:/opt/druid/druid-services/lib/commons-pool-1.6.jar:/opt/druid/druid-services/lib/jline-0.9.94.jar:/opt/druid/druid-services/lib/tesla-aether-0.0.5.jar:/opt/druid/druid-services/lib/activation-1.1.1.jar:/opt/druid/druid-services/lib/plexus-utils-3.0.15.jar:/opt/druid/druid-services/lib/commons-dbcp-1.4.jar:/opt/druid/druid-services/lib/jackson-core-2.2.3.jar:/opt/druid/druid-services/lib/jetty-http-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/opencsv-2.3.jar:/opt/druid/druid-services/lib/curator-recipes-2.6.0.jar:/opt/druid/druid-services/lib/plexus-interpolation-1.19.jar:/opt/druid/druid-services/lib/protobuf-java-2.5.0.jar:/opt/druid/druid-services/lib/jackson-datatype-guava-2.2.3.jar:/opt/druid/druid-services/lib/lz4-1.1.2.jar:/opt/druid/druid-services/lib/airline-0.6.jar:/opt/druid/druid-services/lib/geoip2-0.4.0.jar:/opt/druid/druid-services/lib/java-xmlbuilder-0.4.jar:/opt/druid/druid-services/lib/aether-spi-0.9.0.M2.jar:/opt/druid/drui
d-services/lib/guice-4.0-beta.jar:/opt/druid/druid-services/lib/bytebuffer-collections-0.0.2.jar:/opt/druid/druid-services/lib/maven-model-builder-3.1.1.jar:/opt/druid/druid-services/lib/guice-servlet-4.0-beta.jar:/opt/druid/druid-services/lib/log4j-1.2.16.jar:/opt/druid/druid-services/lib/jdbi-2.32.jar:/opt/druid/druid-services/lib/validation-api-1.1.0.Final.jar:/opt/druid/druid-services/lib/jetty-continuation-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/zookeeper-3.4.6.jar:/opt/druid/druid-services/lib/xpp3-1.1.4c.jar:/opt/druid/druid-services/lib/druid-indexing-service-0.6.146.jar:/opt/druid/druid-services/lib/jackson-dataformat-smile-2.2.3.jar:/opt/druid/druid-services/lib/icu4j-4.8.1.jar:/opt/druid/druid-services/lib/mx4j-3.0.2.jar:/opt/druid/druid-services/lib/config-magic-0.9.jar:/opt/druid/druid-services/lib/jackson-mapper-asl-1.9.13.jar:/opt/druid/druid-services/lib/druid-common-0.6.146.jar:/opt/druid/druid-services/lib/jackson-databind-2.2.3.jar:/opt/druid/druid-services/lib/commons-codec-1.7.jar:/opt/druid/druid-services/lib/sigar-1.6.5.132.jar:/opt/druid/druid-services/lib/httpcore-4.2.jar:/opt/druid/druid-services/lib/maven-settings-builder-3.1.1.jar:/opt/druid/druid-services/lib/commons-io-2.0.1.jar:/opt/druid/druid-services/lib/aether-connector-okhttp-0.0.9.jar:/opt/druid/druid-services/lib/aopalliance-1.0.jar:/opt/druid/druid-services/lib/druid-processing-0.6.146.jar:/opt/druid/druid-services/lib/druid-server-0.6.146.jar:/opt/druid/druid-services/lib/maven-settings-3.1.1.jar:/opt/druid/druid-services/lib/curator-x-discovery-2.6.0.jar:/opt/druid/druid-services/lib/guava-16.0.1.jar:/opt/druid/druid-services/lib/jersey-core-1.17.1.jar:/opt/druid/druid-services/lib/jackson-jaxrs-base-2.2.3.jar:/opt/druid/druid-services/lib/google-http-client-1.15.0-rc.jar:/opt/druid/druid-services/lib/org.abego.treelayout.core-1.0.1.jar:/opt/druid/druid-services/lib/jersey-servlet-1.17.1.jar:/opt/druid/druid-services/lib/commons-logging-1.1.1.jar:/opt/druid/druid-services/lib/java-util-0.26.6.jar:/opt/druid/druid-services/lib/bcprov-jdk15-1.46.jar:/opt/druid/druid-services/lib/commons-lang-2.6.jar:/opt/druid/druid-services/lib/aether-api-0.9.0.M2.jar:/opt/druid/druid-services/lib/wagon-provider-api-2.4.jar:/opt/druid/druid-services/lib/druid-api-0.2.7.jar:/opt/druid/druid-services/lib/jackson-annotations-2.2.3.jar:/opt/druid/druid-services/lib/commons-cli-1.2.jar:/opt/druid/druid-services/lib/emitter-0.2.11.jar:/opt/druid/druid-services/lib/aether-util-0.9.0.M2.jar:/opt/druid/druid-services/lib/jets3t-0.9.1.jar:/opt/druid/druid-services/lib/irc-api-1.0-0011.jar:/opt/druid/druid-services/lib/asm-3.1.jar:/opt/druid/druid-services/lib/jersey-guice-1.17.1.jar:/opt/druid/druid-services/lib/jetty-io-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/classmate-0.8.0.jar:/opt/druid/druid-services/lib/jetty-client-9.2.2.v20140723.jar:/opt/druid/druid-services/lib/extendedset-1.3.4.jar:/opt/druid/druid-services/lib/google-http-client-jackson2-1.15.0-rc.jar:/opt/druid/etc/overlord
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=/tmp
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.version=3.2.0-60-generic
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.name=root
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.home=/root
2014-10-16 10:06:03,701 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/opt/druid/druid-services-0.6.146
2014-10-16 10:06:03,702 INFO [main] org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=kafka01-dev:2181 sessionTimeout=30000 watcher=org.apache.curator.ConnectionState@a5c7c18
2014-10-16 10:06:03,777 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.discovery.ServerDiscoverySelector.start() throws java.lang.Exception] on object[io.druid.curator.discovery.ServerDiscoverySelector@34168b4].
2014-10-16 10:06:03,847 INFO [main-SendThread(kafka01-dev:2181)] org.apache.zookeeper.ClientCnxn - Opening socket connection to server kafka01-dev. Will not attempt to authenticate using SASL (unknown error)
2014-10-16 10:06:03,851 INFO [main-SendThread(kafka01-dev:2181)] org.apache.zookeeper.ClientCnxn - Socket connection established to kafka01-dev, initiating session
2014-10-16 10:06:03,861 WARN [main-SendThread(kafka01-dev:2181)] org.apache.zookeeper.ClientCnxnSocket - Connected to an old server; r-o mode will be unavailable
2014-10-16 10:06:03,861 INFO [main-SendThread(kafka01-dev:2181)] org.apache.zookeeper.ClientCnxn - Session establishment complete on server kafka01-dev, sessionid = 0x148ef0ba0bb069f, negotiated timeout = 30000
2014-10-16 10:06:03,864 INFO [main-EventThread] org.apache.curator.framework.state.ConnectionStateManager - State change: CONNECTED
2014-10-16 10:06:03,930 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.announcement.Announcer.start()] on object[io.druid.curator.announcement.Announcer@1151848e].
2014-10-16 10:06:03,931 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.client.ServerInventoryView.start() throws java.lang.Exception] on object[io.druid.client.SingleServerInventoryView@6b950106].
2014-10-16 10:06:03,934 INFO [main] org.eclipse.jetty.server.Server - jetty-9.2.2.v20140723
2014-10-16 10:06:04,012 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8091
2014-10-16 10:06:04,013 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8091, inventoryPath /druid/servedSegments/druid-historic01-dev:8091
2014-10-16 10:06:04,013 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8091', host='druid-historic01-dev:8091', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:06:04,013 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-realtime01-dev:8083
2014-10-16 10:06:04,014 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-realtime01-dev:8083, inventoryPath /druid/servedSegments/druid-realtime01-dev:8083
2014-10-16 10:06:04,014 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-realtime01-dev:8083', host='druid-realtime01-dev:8083', maxSize=0, tier='_default_tier', type='realtime', priority='10'}]
2014-10-16 10:06:04,014 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8081
2014-10-16 10:06:04,014 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8081, inventoryPath /druid/servedSegments/druid-historic01-dev:8081
2014-10-16 10:06:04,014 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8081', host='druid-historic01-dev:8081', maxSize=53687091200, tier='_default_tier', type='historical', priority='0'}]
2014-10-16 10:06:04,014 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8088
2014-10-16 10:06:04,014 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8088, inventoryPath /druid/servedSegments/druid-historic01-dev:8088
2014-10-16 10:06:04,014 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8088', host='druid-historic01-dev:8088', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:06:04,015 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8089
2014-10-16 10:06:04,015 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8089, inventoryPath /druid/servedSegments/druid-historic01-dev:8089
2014-10-16 10:06:04,015 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8089', host='druid-historic01-dev:8089', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:06:04,015 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8092
2014-10-16 10:06:04,015 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8092, inventoryPath /druid/servedSegments/druid-historic01-dev:8092
2014-10-16 10:06:04,015 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8092', host='druid-historic01-dev:8092', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:06:04,015 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8093
2014-10-16 10:06:04,016 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8093, inventoryPath /druid/servedSegments/druid-historic01-dev:8093
2014-10-16 10:06:04,016 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8093', host='druid-historic01-dev:8093', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
Oct 16, 2014 10:06:04 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class
Oct 16, 2014 10:06:04 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering io.druid.server.StatusResource as a root resource class
Oct 16, 2014 10:06:04 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.17.1 02/28/2013 12:47 PM'
2014-10-16 10:06:04,140 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-realtime01-dev:8083] added segment[crud_historic_toolbars_2014-06-20T00:00:00.000Z_2014-06-21T00:00:00.000Z_2014-06-20T00:00:00.000Z]
2014-10-16 10:06:04,144 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-realtime01-dev:8083] added segment[crud_historic_toolbars_2014-09-11T00:00:00.000Z_2014-09-12T00:00:00.000Z_2014-09-11T00:00:00.000Z]
2014-10-16 10:06:04,145 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-realtime01-dev:8083] added segment[crud_historic_toolbars_2014-07-24T00:00:00.000Z_2014-07-25T00:00:00.000Z_2014-07-24T00:00:00.000Z]
2014-10-16 10:06:04,146 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-realtime01-dev:8083] added segment[crud_historic_toolbars_2014-10-08T00:00:00.000Z_2014-10-09T00:00:00.000Z_2014-10-08T00:00:00.000Z]
2014-10-16 10:06:04,148 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T12:39:00.000Z_2014-10-14T12:40:00.000Z_2014-10-14T12:39:13.753Z]
2014-10-16 10:06:04,153 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-09-09T00:00:00.000Z_2014-09-10T00:00:00.000Z_2014-09-09T00:00:00.000Z]
2014-10-16 10:06:04,153 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T09:39:00.000Z_2014-10-14T09:40:00.000Z_2014-10-14T09:39:59.417Z_1]
2014-10-16 10:06:04,174 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-08-20T00:00:00.000Z_2014-08-21T00:00:00.000Z_2014-08-20T00:00:00.000Z]
2014-10-16 10:06:04,175 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:16:00.000Z_2014-10-16T09:17:00.000Z_2014-10-16T09:17:12.579Z]
2014-10-16 10:06:04,176 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:51:00.000Z_2014-10-16T09:52:00.000Z_2014-10-16T09:52:09.213Z]
2014-10-16 10:06:04,184 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:09:00.000Z_2014-10-16T09:10:00.000Z_2014-10-16T09:10:17.454Z]
2014-10-16 10:06:04,186 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T08:51:00.000Z_2014-10-16T08:52:00.000Z_2014-10-16T08:53:05.200Z]
2014-10-16 10:06:04,190 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T08:24:00.000Z_2014-10-16T08:25:00.000Z_2014-10-16T08:26:05.399Z]
2014-10-16 10:06:04,190 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T13:22:00.000Z_2014-10-14T13:23:00.000Z_2014-10-14T13:22:14.398Z]
2014-10-16 10:06:04,194 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:54:00.000Z_2014-10-16T09:55:00.000Z_2014-10-16T09:55:14.737Z]
2014-10-16 10:06:04,195 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-08-12T00:00:00.000Z_2014-08-13T00:00:00.000Z_2014-08-12T00:00:00.000Z]
2014-10-16 10:06:04,197 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T08:11:00.000Z_2014-10-16T08:12:00.000Z_2014-10-16T08:11:06.288Z]
2014-10-16 10:06:04,199 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T11:06:00.000Z_2014-10-14T11:07:00.000Z_2014-10-14T11:30:39.274Z]
2014-10-16 10:06:04,200 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T14:22:00.000Z_2014-10-10T14:23:00.000Z_2014-10-10T14:23:01.575Z]
2014-10-16 10:06:04,204 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:05:00.000Z_2014-10-16T09:06:00.000Z_2014-10-16T09:06:07.381Z]
2014-10-16 10:06:04,207 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T13:35:00.000Z_2014-10-14T13:36:00.000Z_2014-10-14T13:35:13.124Z]
2014-10-16 10:06:04,208 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T12:44:00.000Z_2014-10-10T12:45:00.000Z_2014-10-10T12:44:31.741Z]
2014-10-16 10:06:04,209 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T15:34:00.000Z_2014-10-10T15:35:00.000Z_2014-10-10T15:34:37.818Z]
2014-10-16 10:06:04,210 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T11:17:00.000Z_2014-10-10T11:18:00.000Z_2014-10-10T11:17:38.254Z]
2014-10-16 10:06:04,216 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T14:04:00.000Z_2014-10-10T14:05:00.000Z_2014-10-10T14:07:38.070Z]
Oct 16, 2014 10:06:04 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
2014-10-16 10:06:04,229 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T11:10:00.000Z_2014-10-14T11:11:00.000Z_2014-10-14T11:30:39.132Z]
...crud_historic_toolbars_2014-10-16T09:30:00.000Z_2014-10-16T09:31:00.000Z_2014-10-16T09:31:08.792Z]
2014-10-16 10:06:04,233 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:44:00.000Z_2014-10-16T09:45:00.000Z_2014-10-16T09:45:10.864Z]
...crud_historic_toolbars_2014-05-05T00:00:00.000Z_2014-05-06T00:00:00.000Z_2014-05-05T00:00:00.000Z]
2014-10-16 10:06:04,239 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[ttb_advertiser_csv_2014-04-26T00:00:00.000Z_2014-04-27T00:00:00.000Z_2014-05-20T15:55:39.353Z]
2014-10-16 10:06:04,240 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T08:10:00.000Z_2014-10-10T08:11:00.000Z_2014-10-10T08:11:14.302Z]
2014-10-16 10:06:04,240 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_toolbars_high_resolution_2014-06-09T00:00:00.000Z_2014-06-10T00:00:00.000Z_2014-06-09T00:00:00.000Z]
2014-10-16 10:06:04,241 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T14:43:00.000Z_2014-10-10T14:44:00.000Z_2014-10-10T14:44:17.303Z]
2014-10-16 10:06:04,242 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[ttb_advertiser_csv_2014-04-15T00:00:00.000Z_2014-04-16T00:00:00.000Z_2014-05-20T15:55:39.353Z]
2014-10-16 10:06:04,242 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T14:09:00.000Z_2014-10-10T14:10:00.000Z_2014-10-10T14:09:59.527Z]
2014-10-16 10:06:04,243 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-15T15:40:00.000Z_2014-10-15T15:41:00.000Z_2014-10-15T15:40:44.981Z]
2014-10-16 10:06:04,244 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[ttb_advertiser_csv_2014-07-04T23:00:00.000Z_2014-07-05T00:00:00.000Z_2014-07-08T16:01:43.074Z]
2014-10-16 10:06:04,244 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[ttb_advertiser_csv_2014-04-01T14:00:00.000Z_2014-04-01T15:00:00.000Z_2014-09-11T10:17:21.764Z]
2014-10-16 10:06:04,245 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T08:15:00.000Z_2014-10-14T08:16:00.000Z_2014-10-14T08:15:47.384Z]
2014-10-16 10:06:04,246 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[ttb_advertiser_csv_2014-04-02T00:00:00.000Z_2014-04-02T01:00:00.000Z_2014-09-11T10:17:21.764Z]
2014-10-16 10:06:04,246 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T08:14:00.000Z_2014-10-14T08:15:00.000Z_2014-10-14T08:14:28.778Z]
2014-10-16 10:06:04,254 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-09T15:15:00.000Z_2014-10-09T15:16:00.000Z_2014-10-09T15:16:05.995Z]
2014-10-16 10:06:04,263 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T13:36:00.000Z_2014-10-14T13:37:00.000Z_2014-10-14T13:36:14.670Z]
2014-10-16 10:06:04,265 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:23:00.000Z_2014-10-16T09:24:00.000Z_2014-10-16T09:24:09.543Z]
...Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:02:00.000Z_2014-10-16T09:03:00.000Z_2014-10-16T09:03:05.903Z]
2014-10-16 10:06:04,273 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T12:31:00.000Z_2014-10-10T12:32:00.000Z_2014-10-10T12:32:53.391Z]
2014-10-16 10:06:04,274 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-08-01T00:00:00.000Z_2014-08-02T00:00:00.000Z_2014-08-01T00:00:00.000Z]
2014-10-16 10:06:04,281 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T08:58:00.000Z_2014-10-16T08:59:00.000Z_2014-10-16T08:59:14.590Z]
2014-10-16 10:06:04,282 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[ttb_advertiser_csv_2014-04-02T02:00:00.000Z_2014-04-02T03:00:00.000Z_2014-09-11T10:17:21.764Z]
2014-10-16 10:06:04,282 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-04-28T00:00:00.000Z_2014-04-29T00:00:00.000Z_2014-04-28T00:00:00.000Z]
...crud_historic_toolbars_2014-06-10T00:00:00.000Z_2014-06-11T00:00:00.000Z_2014-06-10T00:00:00.000Z]
2014-10-16 10:06:04,298 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-09T12:19:00.000Z_2014-10-09T12:20:00.000Z_2014-10-09T12:19:05.905Z]
2014-10-16 10:06:04,304 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-06-19T00:00:00.000Z_2014-06-20T00:00:00.000Z_2014-06-19T00:00:00.000Z]
2014-10-16 10:06:04,307 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:26:00.000Z_2014-10-16T09:27:00.000Z_2014-10-16T09:27:08.432Z]
2014-10-16 10:06:04,307 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-09T12:51:00.000Z_2014-10-09T12:52:00.000Z_2014-10-09T12:56:47.034Z]
2014-10-16 10:06:04,308 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-09T13:43:00.000Z_2014-10-09T13:44:00.000Z_2014-10-09T13:43:14.978Z]
2014-10-16 10:06:04,312 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-05-06T00:00:00.000Z_2014-05-07T00:00:00.000Z_2014-05-06T00:00:00.000Z]
2014-10-16 10:06:04,314 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T11:09:00.000Z_2014-10-14T11:10:00.000Z_2014-10-14T11:09:44.715Z_1]
2014-10-16 10:06:04,316 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T10:42:00.000Z_2014-10-14T10:43:00.000Z_2014-10-14T10:42:56.855Z_1]
2014-10-16 10:06:04,320 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:19:00.000Z_2014-10-16T09:20:00.000Z_2014-10-16T09:20:10.982Z]
2014-10-16 10:06:04,321 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T14:23:00.000Z_2014-10-10T14:24:00.000Z_2014-10-10T14:23:47.921Z]
2014-10-16 10:06:04,334 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:40:00.000Z_2014-10-16T09:41:00.000Z_2014-10-16T09:41:09.030Z]
2014-10-16 10:06:04,334 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:37:00.000Z_2014-10-16T09:38:00.000Z_2014-10-16T09:38:09.519Z]
2014-10-16 10:06:04,336 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:47:00.000Z_2014-10-16T09:48:00.000Z_2014-10-16T09:48:15.268Z]
2014-10-16 10:06:04,337 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T15:36:00.000Z_2014-10-10T15:37:00.000Z_2014-10-10T15:36:19.537Z]
2014-10-16 10:06:04,345 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-09T09:00:00.000Z_2014-10-09T10:00:00.000Z_2014-10-09T09:10:09.440Z]
2014-10-16 10:06:04,346 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T15:37:00.000Z_2014-10-10T15:38:00.000Z_2014-10-10T15:37:15.054Z]
2014-10-16 10:06:04,349 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T10:26:00.000Z_2014-10-14T10:27:00.000Z_2014-10-14T10:27:02.082Z]
2014-10-16 10:06:04,351 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-09T14:19:00.000Z_2014-10-09T14:20:00.000Z_2014-10-09T14:20:22.254Z]
2014-10-16 10:06:04,352 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T10:38:00.000Z_2014-10-14T10:39:00.000Z_2014-10-14T10:38:51.201Z_1]
2014-10-16 10:06:04,360 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-15T14:29:00.000Z_2014-10-15T14:30:00.000Z_2014-10-15T14:29:55.801Z]
2014-10-16 10:06:04,361 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:33:00.000Z_2014-10-16T09:34:00.000Z_2014-10-16T09:34:08.237Z]
2014-10-16 10:06:04,364 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-14T13:23:00.000Z_2014-10-14T13:24:00.000Z_2014-10-14T13:23:41.561Z]
2014-10-16 10:06:04,366 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:12:00.000Z_2014-10-16T09:13:00.000Z_2014-10-16T09:13:08.383Z]
2014-10-16 10:06:04,367 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T15:35:00.000Z_2014-10-10T15:36:00.000Z_2014-10-10T15:36:11.048Z]
2014-10-16 10:06:04,367 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-15T14:21:00.000Z_2014-10-15T14:22:00.000Z_2014-10-15T14:21:58.176Z]
2014-10-16 10:06:04,381 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T08:10:00.000Z_2014-10-16T08:11:00.000Z_2014-10-16T08:10:15.953Z]
2014-10-16 10:06:04,382 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T11:05:00.000Z_2014-10-10T11:06:00.000Z_2014-10-10T11:05:45.287Z]
2014-10-16 10:06:04,383 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-10T14:29:00.000Z_2014-10-10T14:30:00.000Z_2014-10-10T14:29:37.711Z]
2014-10-16 10:06:04,393 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T08:27:00.000Z_2014-10-16T08:28:00.000Z_2014-10-16T08:29:04.563Z]
2014-10-16 10:06:04,395 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-08-26T00:00:00.000Z_2014-08-27T00:00:00.000Z_2014-08-26T00:00:00.000Z]
2014-10-16 10:06:04,396 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-06-09T00:00:00.000Z_2014-06-10T00:00:00.000Z_2014-06-09T00:00:00.000Z]
2014-10-16 10:06:04,403 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-08-21T00:00:00.000Z_2014-08-22T00:00:00.000Z_2014-08-21T00:00:00.000Z]
2014-10-16 10:06:04,403 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8088] added segment[crud_historic_toolbars_2014-10-16T09:58:00.000Z_2014-10-16T09:59:00.000Z_2014-10-16T09:59:09.120Z]
2014-10-16 10:06:04,404 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8089] added segment[crud_historic_toolbars_2014-10-16T10:02:00.000Z_2014-10-16T10:03:00.000Z_2014-10-16T10:02:11.473Z]
Oct 16, 2014 10:06:04 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.QueryResource to GuiceInstantiatedComponentProvider
Oct 16, 2014 10:06:04 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.segment.realtime.firehose.ChatHandlerResource to GuiceInstantiatedComponentProvider
Oct 16, 2014 10:06:04 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding io.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2014-10-16 10:06:04,602 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@14d9eec4{/,null,AVAILABLE}
2014-10-16 10:06:04,608 INFO [main] org.eclipse.jetty.server.ServerConnector - Started ServerConnector@62e17d2b{HTTP/1.1}{0.0.0.0:8090}
2014-10-16 10:06:04,608 INFO [main] org.eclipse.jetty.server.Server - Started @3760ms
2014-10-16 10:06:04,608 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.start()] on object[io.druid.server.coordination.SingleDataSegmentAnnouncer@35afff3b].
2014-10-16 10:06:04,608 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Announcing self[DruidServerMetadata{name='druid-historic01-dev:8090', host='druid-historic01-dev:8090', maxSize=0, tier='_default_tier', type='realtime', priority='0'}] at [/druid/announcements/druid-historic01-dev:8090]
2014-10-16 10:06:04,636 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.start()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@21ba659c].
2014-10-16 10:06:04,638 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Announcing self[DruidServerMetadata{name='druid-historic01-dev:8090', host='druid-historic01-dev:8090', maxSize=0, tier='_default_tier', type='realtime', priority='0'}] at [/druid/announcements/druid-historic01-dev:8090]
2014-10-16 10:06:04,639 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.start()] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@3056bfb9].
2014-10-16 10:06:04,677 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8090
2014-10-16 10:06:04,677 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8090, inventoryPath /druid/servedSegments/druid-historic01-dev:8090
2014-10-16 10:06:04,677 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8090', host='druid-historic01-dev:8090', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:06:04,795 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Running with task: {
  "type" : "index_realtime",
  "id" : "index_realtime_crud_historic_toolbars_2014-10-16T10:05:00.000Z_0_0_ehglngel",
  "resource" : {
    "availabilityGroup" : "crud_historic_toolbars-05-0000",
          "type" : "none"
        },
        "intervals" : null
      }
    },
    "ioConfig" : {
      "type" : "realtime",
      "firehose" : {
        "type" : "clipped",
        "delegate" : {
          "type" : "timed",
          "delegate" : {
            "type" : "receiver",
            "serviceName" : "druid:local:firehose:crud_historic_toolbars-05-0000-0000",
            "bufferSize" : 100000,
            "parser" : {
              "type" : "map",
              "parseSpec" : {
                "format" : "json",
                "timestampSpec" : {
                  "column" : "created_at",
                  "format" : "iso"
                },
                "dimensionsSpec" : {
                  "dimensions" : [ "affiliate_id", "country_id", "promo_id", "app", "traffic_source_id", "toolbar_id", "channel_id", "advertiser_channel", "created_at" ],
                  "dimensionExclusions" : [ "error_90", "installs" ],
                  "spatialDimensions" : [ ]
                }
              }
            }
          },
          "shutoffTime" : "2014-10-16T10:12:00.000Z"
        },
        "interval" : "2014-10-16T10:05:00.000Z/2014-10-16T10:06:00.000Z"
      }
    },
    "tuningConfig" : {
      "type" : "realtime",
      "maxRowsInMemory" : 75000,
      "intermediatePersistPeriod" : "PT1M",
      "windowPeriod" : "PT1M",
      "basePersistDirectory" : "/tmp/1413446170221-0",
      "versioningPolicy" : {
        "type" : "intervalStart"
      },
      "maxPendingPersists" : 1,
      "shardSpec" : {
        "type" : "linear",
        "partitionNum" : 0
      },
      "rejectionPolicyFactory" : {
        "type" : "none"
      }
    }
  },
  "groupId" : "index_realtime_crud_historic_toolbars",
  "dataSource" : "crud_historic_toolbars"
}
2014-10-16 10:06:04,803 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Running task: index_realtime_crud_historic_toolbars_2014-10-16T10:05:00.000Z_0_0_ehglngel
2014-10-16 10:06:04,804 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_realtime_crud_historic_toolbars_2014-10-16T10:05:00.000Z_0_0_ehglngel]: LockListAction{}
2014-10-16 10:06:04,810 INFO [task-runner-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_realtime_crud_historic_toolbars_2014-10-16T10:05:00.000Z_0_0_ehglngel] to overlord[http://druid-historic01-dev:8087/druid/indexer/v1/action]: LockListAction{}
2014-10-16 10:06:04,820 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-16 10:06:04,844 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-16 10:06:04,844 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-16 10:06:04,844 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-16 10:06:04,845 INFO [task-runner-0] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://druid-historic01-dev:8087
2014-10-16 10:06:04,875 INFO [task-runner-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Connecting firehose: druid:local:firehose:crud_historic_toolbars-05-0000-0000
2014-10-16 10:06:04,875 INFO [task-runner-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Found chathandler of class[io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider]
2014-10-16 10:06:04,876 INFO [task-runner-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Registering Eventhandler[druid:local:firehose:crud_historic_toolbars-05-0000-0000]
2014-10-16 10:06:04,878 INFO [task-runner-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Announcing service[DruidNode{serviceName='druid:local:firehose:crud_historic_toolbars-05-0000-0000', host='druid-historic01-dev:8090', port=8090}]
2014-10-16 10:06:04,909 INFO [task-runner-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Registering Eventhandler[crud_historic_toolbars-05-0000-0000]
2014-10-16 10:06:04,910 INFO [task-runner-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Announcing service[DruidNode{serviceName='crud_historic_toolbars-05-0000-0000', host='druid-historic01-dev:8090', port=8090}]
2014-10-16 10:06:05,139 INFO [task-runner-0] io.druid.data.input.FirehoseFactory - Firehose created, will shut down at: 2014-10-16T10:12:00.000Z
2014-10-16 10:06:05,145 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Creating plumber using rejectionPolicy[io.druid.segment.realtime.plumber.NoopRejectionPolicyFactory$1@11cef7bc]
2014-10-16 10:06:05,148 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Expect to run at [2014-10-16T10:08:00.000Z]
2014-10-16 10:06:05,150 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2014-10-16 10:06:05,150 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]
2014-10-16 10:06:05,150 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2014-10-16 10:06:05,219 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T09:58:00.000Z_2014-10-16T09:59:00.000Z_2014-10-16T09:59:09.120Z]
2014-10-16 10:06:05,259 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8088] removed segment[crud_historic_toolbars_2014-10-16T09:58:00.000Z_2014-10-16T09:59:00.000Z_2014-10-16T09:59:09.120Z]
2014-10-16 10:06:05,299 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Closing inventory cache for druid-historic01-dev:8088. Also removing listeners.
2014-10-16 10:06:05,301 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server Disappeared[DruidServerMetadata{name='druid-historic01-dev:8088', host='druid-historic01-dev:8088', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:07:00,237 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Closing inventory cache for druid-historic01-dev:8091. Also removing listeners.
2014-10-16 10:07:00,237 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server Disappeared[DruidServerMetadata{name='druid-historic01-dev:8091', host='druid-historic01-dev:8091', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:07:05,162 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:07:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/thrownAway","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:07:05,162 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:07:05.162Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/unparseable","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:07:05,162 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:07:05.162Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/processed","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:07:05,162 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:07:05.162Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"rows/output","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:07:15,941 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8088
2014-10-16 10:07:15,941 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8088, inventoryPath /druid/servedSegments/druid-historic01-dev:8088
2014-10-16 10:07:15,941 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8088', host='druid-historic01-dev:8088', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:07:16,509 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8088] added segment[crud_historic_toolbars_2014-10-16T10:07:00.000Z_2014-10-16T10:08:00.000Z_2014-10-16T10:07:16.380Z]
2014-10-16 10:08:00,002 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2014-10-16 10:08:00,003 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]
2014-10-16 10:08:00,003 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2014-10-16 10:08:05,151 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:08:05.151Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/thrownAway","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:08:05,151 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:08:05.151Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/unparseable","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:08:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:08:05.151Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/processed","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:08:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:08:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"rows/output","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:08:05,336 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8091
2014-10-16 10:08:05,336 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8091, inventoryPath /druid/servedSegments/druid-historic01-dev:8091
2014-10-16 10:08:05,336 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8091', host='druid-historic01-dev:8091', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:09:00,002 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2014-10-16 10:09:00,002 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]
2014-10-16 10:09:00,003 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2014-10-16 10:09:05,151 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:09:05.151Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/thrownAway","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:09:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:09:05.151Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/unparseable","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:09:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:09:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/processed","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:09:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:09:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"rows/output","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:10:00,004 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2014-10-16 10:10:00,004 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]
2014-10-16 10:10:00,004 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2014-10-16 10:10:00,229 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Closing inventory cache for druid-historic01-dev:8092. Also removing listeners.
2014-10-16 10:10:00,229 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server Disappeared[DruidServerMetadata{name='druid-historic01-dev:8092', host='druid-historic01-dev:8092', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:10:04,994 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8092
2014-10-16 10:10:04,994 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8092, inventoryPath /druid/servedSegments/druid-historic01-dev:8092
2014-10-16 10:10:04,994 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8092', host='druid-historic01-dev:8092', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:10:05,151 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:10:05.151Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/thrownAway","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:10:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:10:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/unparseable","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:10:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:10:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/processed","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:10:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:10:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"rows/output","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:10:05,445 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8081] added segment[crud_historic_toolbars_2014-10-16T10:02:00.000Z_2014-10-16T10:03:00.000Z_2014-10-16T10:02:11.473Z]
2014-10-16 10:10:05,460 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8089] removed segment[crud_historic_toolbars_2014-10-16T10:02:00.000Z_2014-10-16T10:03:00.000Z_2014-10-16T10:02:11.473Z]
2014-10-16 10:10:05,536 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Closing inventory cache for druid-historic01-dev:8089. Also removing listeners.
2014-10-16 10:10:05,536 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server Disappeared[DruidServerMetadata{name='druid-historic01-dev:8089', host='druid-historic01-dev:8089', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:10:26,658 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8089
2014-10-16 10:10:26,658 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8089, inventoryPath /druid/servedSegments/druid-historic01-dev:8089
2014-10-16 10:10:26,658 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8089', host='druid-historic01-dev:8089', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:10:27,110 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server[druid-historic01-dev:8089] added segment[crud_historic_toolbars_2014-10-16T10:10:00.000Z_2014-10-16T10:11:00.000Z_2014-10-16T10:10:27.070Z]
2014-10-16 10:11:00,005 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2014-10-16 10:11:00,005 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]
2014-10-16 10:11:00,005 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2014-10-16 10:11:00,236 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Closing inventory cache for druid-historic01-dev:8093. Also removing listeners.
2014-10-16 10:11:00,236 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server Disappeared[DruidServerMetadata{name='druid-historic01-dev:8093', host='druid-historic01-dev:8093', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:11:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:11:05.151Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/thrownAway","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:11:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:11:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/unparseable","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:11:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:11:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"events/processed","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:11:05,152 INFO [MonitorScheduler-0] com.metamx.emitter.core.LoggingEmitter - Event [{"feed":"metrics","timestamp":"2014-10-16T10:11:05.152Z","service":"overlord","host":"druid-historic01-dev:8090","metric":"rows/output","value":0,"user2":"crud_historic_toolbars"}]
2014-10-16 10:11:06,569 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Created new InventoryCacheListener for /druid/servedSegments/druid-historic01-dev:8093
2014-10-16 10:11:06,569 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Starting inventory cache for druid-historic01-dev:8093, inventoryPath /druid/servedSegments/druid-historic01-dev:8093
2014-10-16 10:11:06,569 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - New Server[DruidServerMetadata{name='druid-historic01-dev:8093', host='druid-historic01-dev:8093', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:12:00,000 INFO [timed-shutoff-firehose-0] io.druid.data.input.FirehoseFactory - Closing delegate firehose.
2014-10-16 10:12:00,000 INFO [timed-shutoff-firehose-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Firehose closing.
2014-10-16 10:12:00,001 INFO [timed-shutoff-firehose-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Unregistering chat handler[druid:local:firehose:crud_historic_toolbars-05-0000-0000]
2014-10-16 10:12:00,001 INFO [timed-shutoff-firehose-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Unannouncing service[DruidNode{serviceName='druid:local:firehose:crud_historic_toolbars-05-0000-0000', host='druid-historic01-dev:8090', port=8090}]
2014-10-16 10:12:00,005 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2014-10-16 10:12:00,005 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks. minTimestamp [1970-01-01T00:00:00.000Z]
2014-10-16 10:12:00,005 INFO [crud_historic_toolbars-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2014-10-16 10:12:00,247 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Submitting persist runnable for dataSource[crud_historic_toolbars]
2014-10-16 10:12:00,248 INFO [task-runner-0] io.druid.segment.realtime.plumber.RealtimePlumber - Shutting down...
2014-10-16 10:12:00,248 INFO [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Removing task directory: /tmp/persistent/task/index_realtime_crud_historic_toolbars_2014-10-16T10:05:00.000Z_0_0_ehglngel/work
2014-10-16 10:12:00,258 INFO [task-runner-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_realtime_crud_historic_toolbars_2014-10-16T10:05:00.000Z_0_0_ehglngel",
  "status" : "SUCCESS",
  "duration" : 355451
}
2014-10-16 10:12:00,261 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.stop()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@21ba659c].
2014-10-16 10:12:00,261 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Stopping class io.druid.server.coordination.BatchDataSegmentAnnouncer with config[io.druid.server.initialization.ZkPathsConfig$$EnhancerByCGLIB$$4acb6b92@6d385084]
2014-10-16 10:12:00,261 INFO [main] io.druid.curator.announcement.Announcer - unannouncing [/druid/announcements/druid-historic01-dev:8090]
2014-10-16 10:12:00,272 INFO [ServerInventoryView-0] io.druid.curator.inventory.CuratorInventoryManager - Closing inventory cache for druid-historic01-dev:8090. Also removing listeners.
2014-10-16 10:12:00,272 INFO [ServerInventoryView-0] io.druid.client.SingleServerInventoryView - Server Disappeared[DruidServerMetadata{name='druid-historic01-dev:8090', host='druid-historic01-dev:8090', maxSize=0, tier='_default_tier', type='realtime', priority='0'}]
2014-10-16 10:12:00,276 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.stop()] on object[io.druid.server.coordination.SingleDataSegmentAnnouncer@35afff3b].
2014-10-16 10:12:00,276 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Stopping class io.druid.server.coordination.SingleDataSegmentAnnouncer with config[io.druid.server.initialization.ZkPathsConfig$$EnhancerByCGLIB$$4acb6b92@6d385084]
2014-10-16 10:12:00,276 INFO [main] io.druid.curator.announcement.Announcer - unannouncing [/druid/announcements/druid-historic01-dev:8090]
2014-10-16 10:12:00,276 ERROR [main] io.druid.curator.announcement.Announcer - Path[/druid/announcements/druid-historic01-dev:8090] not announced, cannot unannounce.
2014-10-16 10:12:00,277 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.stop()] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@3056bfb9].
2014-10-16 10:12:00,278 INFO [main] org.eclipse.jetty.server.ServerConnector - Stopped ServerConnector@62e17d2b{HTTP/1.1}{0.0.0.0:8090}
2014-10-16 10:12:00,280 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Stopped o.e.j.s.ServletContextHandler@14d9eec4{/,null,UNAVAILABLE}
2014-10-16 10:12:00,281 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.overlord.ThreadPoolTaskRunner.stop()] on object[io.druid.indexing.overlord.ThreadPoolTaskRunner@3e4f5a4c].
2014-10-16 10:12:00,281 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.client.ServerInventoryView.stop() throws java.io.IOException] on object[io.druid.client.SingleServerInventoryView@6b950106].
2014-10-16 10:12:00,281 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.announcement.Announcer.stop()] on object[io.druid.curator.announcement.Announcer@1151848e].
2014-10-16 10:12:00,282 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.discovery.ServerDiscoverySelector.stop() throws java.io.IOException] on object[io.druid.curator.discovery.ServerDiscoverySelector@34168b4].
2014-10-16 10:12:00,290 INFO [main] io.druid.curator.CuratorModule - Stopping Curator
2014-10-16 10:12:00,302 INFO [main-EventThread] org.apache.zookeeper.ClientCnxn - EventThread shut down
2014-10-16 10:12:00,302 INFO [main] org.apache.zookeeper.ZooKeeper - Session: 0x148ef0ba0bb069f closed
2014-10-16 10:12:00,303 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.http.client.HttpClient.stop()] on object[com.metamx.http.client.HttpClient@3a71309f].
2014-10-16 10:12:00,307 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.metrics.MonitorScheduler.stop()] on object[com.metamx.metrics.MonitorScheduler@cd849db].
2014-10-16 10:12:00,307 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.emitter.service.ServiceEmitter.close() throws java.io.IOException] on object[com.metamx.emitter.service.ServiceEmitter@1bc0c549].
2014-10-16 10:12:00,307 INFO [main] com.metamx.emitter.core.LoggingEmitter - Close: started [false]
2014-10-16 10:12:00,308 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.emitter.core.LoggingEmitter.close() throws java.io.IOException] on object[com.metamx.emitter.core.LoggingEmitter@26467d78].
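
A note on the task spec and metrics above: the task only accepts data for the interval 2014-10-16T10:05:00.000Z/2014-10-16T10:06:00.000Z with a windowPeriod of PT1M, and every firehose metric in the log (events/thrownAway, events/unparseable, events/processed, rows/output) stays at 0, which suggests no events reached the peon at all rather than events arriving and being dropped. The snippet below is a minimal sketch in plain Python, not Druid or Tranquility code, of the rough "interval widened by the windowPeriod" rule of thumb discussed in this thread; the exact acceptance rule depends on the firehose and rejection policy, so treat it only as a quick sanity check on the timestamps being sent.

from datetime import datetime, timedelta

# Values copied from the task spec above; timestamps treated as UTC, kept naive for simplicity.
interval_start = datetime(2014, 10, 16, 10, 5)   # "interval" start: 2014-10-16T10:05:00.000Z
interval_end   = datetime(2014, 10, 16, 10, 6)   # "interval" end:   2014-10-16T10:06:00.000Z
window         = timedelta(minutes=1)            # "windowPeriod" : "PT1M"

def could_be_accepted(event_ts):
    # Task interval widened by the windowPeriod on both sides; an approximation only.
    return (interval_start - window) <= event_ts < (interval_end + window)

print(could_be_accepted(datetime(2014, 10, 16, 10, 5, 30)))  # True: inside the task interval
print(could_be_accepted(datetime(2014, 10, 16, 9, 0, 0)))    # False: far older than the window allows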


Fangjin Yang

unread,
Oct 16, 2014, 10:29:36 PM10/16/14
to druid-de...@googlegroups.com
Do you have the full logs of the task or any similar task where events fail to be propagated? I think you only provided the first few lines.

Gian Merlino

unread,
Oct 17, 2014, 1:52:15 AM10/17/14
to druid-de...@googlegroups.com
Interesting that both times you got this error, the peon was on port 8090. Is it possible you have something else listening on port 8090, or something intercepting connections to 8090? Do peons on other ports seem to work?
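
One quick way to test the port theory is to try binding 8090 yourself on that host while no peon task is running; the helper below is a generic Python check written for this thread, not anything that ships with Druid, and the port number is simply the one that appears in the logs above.

import errno
import socket

def port_in_use(port, host="0.0.0.0"):
    # Try to bind the port ourselves; EADDRINUSE means some other process already owns it.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((host, port))
        return False
    except OSError as e:
        return e.errno == errno.EADDRINUSE
    finally:
        s.close()

# Run on druid-historic01-dev while no peon task is active:
print(port_in_use(8090))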