IllegalArgumentException instantiating DelimitedParseSpec


Ben Vogan

May 24, 2016, 3:11:15 PM5/24/16
to Druid User
Hi all,

I am trying to get Druid up and running for the first time and I'm having trouble ingesting data via Tranquility-Kafka.  I've downloaded the Imply distribution 1.2.1 and am following the Clustering documentation to set up my cluster (I'm running 3 nodes to start - master, data and query servers).  My nodes are running CDH 5.7, CentOS 7.2, Java 8 and Node.js 4.4.4.  Everything starts up, but any time a message comes into my Kafka queue I get an exception:

2016-05-24 18:46:49,399 [KafkaConsumer-0] ERROR c.m.tranquility.kafka.KafkaConsumer - Exception:
java.lang.IllegalArgumentException: Instantiation of [simple type, class io.druid.data.input.impl.DelimitedParseSpec] value failed: columns
        at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:2774) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:2700) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at io.druid.segment.indexing.DataSchema.getParser(DataSchema.java:101) ~[io.druid.druid-server-0.9.0.jar:0.9.0]
        at com.metamx.tranquility.druid.DruidBeams$.fromConfigInternal(DruidBeams.scala:293) ~[io.druid.tranquility-core-0.8.0.jar:0.8.0]
        at com.metamx.tranquility.druid.DruidBeams$.fromConfig(DruidBeams.scala:199) ~[io.druid.tranquility-core-0.8.0.jar:0.8.0]
        at com.metamx.tranquility.kafka.KafkaBeamUtils$.createTranquilizer(KafkaBeamUtils.scala:40) ~[io.druid.tranquility-kafka-0.8.0.jar:0.8.0]
        at com.metamx.tranquility.kafka.KafkaBeamUtils.createTranquilizer(KafkaBeamUtils.scala) ~[io.druid.tranquility-kafka-0.8.0.jar:0.8.0]
        at com.metamx.tranquility.kafka.writer.TranquilityEventWriter.<init>(TranquilityEventWriter.java:64) ~[io.druid.tranquility-kafka-0.8.0.jar:0.8.0]
        at com.metamx.tranquility.kafka.writer.WriterController.createWriter(WriterController.java:171) ~[io.druid.tranquility-kafka-0.8.0.jar:0.8.0]
        at com.metamx.tranquility.kafka.writer.WriterController.getWriter(WriterController.java:98) ~[io.druid.tranquility-kafka-0.8.0.jar:0.8.0]
        at com.metamx.tranquility.kafka.KafkaConsumer$2.run(KafkaConsumer.java:231) ~[io.druid.tranquility-kafka-0.8.0.jar:0.8.0]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_91]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_91]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_91]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_91]
        at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Instantiation of [simple type, class io.druid.data.input.impl.DelimitedParseSpec] value failed: columns
        at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.wrapException(StdValueInstantiator.java:405) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:234) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:167) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:398) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1064) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:264) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:156) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:126) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedUsingDefaultImpl(AsPropertyTypeDeserializer.java:129) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:92) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.AbstractDeserializer.deserializeWithType(AbstractDeserializer.java:132) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:536) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:344) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1064) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:264) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:156) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:126) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:113) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:84) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.AbstractDeserializer.deserializeWithType(AbstractDeserializer.java:132) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:41) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:2769) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        ... 15 common frames omitted
Caused by: java.lang.NullPointerException: columns
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:229) ~[com.google.guava.guava-16.0.1.jar:na]
        at io.druid.data.input.impl.DelimitedParseSpec.<init>(DelimitedParseSpec.java:52) ~[io.druid.druid-api-0.3.16.jar:0.3.16]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_91]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_91]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_91]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_91]
        at com.fasterxml.jackson.databind.introspect.AnnotatedConstructor.call(AnnotatedConstructor.java:125) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:230) ~[com.fasterxml.jackson.core.jackson-databind-2.4.6.jar:2.4.6]
        ... 35 common frames omitted



My Kafka config:

{
  "dataSources" : {
    "bentest" : {
      "spec" : {
        "dataSchema" : {
          "dataSource" : "bentest",
          "parser" : {
            "type" : "string",
            "format" : "json",
            "parseSpec" : {
              "timestampSpec" : {
                "column" : "timestamp",
                "format" : "millis"
              },
              "flattenSpec" : {
                "useFieldDiscovery" : true,
                "fields" : [
                  {
                    "type" : "path",
                    "name" : "nested-child",
                    "expr" : "$.nested-root.nested-child"
                  }
                ]
              },
              "dimensionsSpec" : {
                "dimensions" : ["nested-child", "afield"]
              }
            }
          },
          "granularitySpec" : {
            "type" : "uniform",
            "segmentGranularity" : "hour",
            "queryGranularity" : "none"
          },
          "metricsSpec" : [
            {
              "type" : "count",
              "name" : "count"
            }
          ]
        },
        "ioConfig" : {
          "type" : "realtime"
        },
        "tuningConfig" : {
          "type" : "realtime",
          "maxRowsInMemory" : "100000",
          "reportParseExceptions" : true,
          "intermediatePersistPeriod" : "PT10M",
          "windowPeriod" : "PT1H",
          "rejectionPolicy" : {
            "type" : "none"
          }
        }
      },
      "properties" : {
        "task.partitions" : "1",
        "task.replicants" : "1",
        "topicPattern" : "bentest"
      }
    }
  },
  "properties" : {
    "zookeeper.connect" : "kafka001",
    "druid.discovery.curator.path" : "/druid/discovery",
    "druid.selectors.indexing.serviceName" : "druid/overlord",
    "commit.periodMillis" : "15000",
    "consumer.numThreads" : "1",
    "kafka.zookeeper.connect" : "kafka001",
    "kafka.group.id" : "tranquility-kafka",
    "kafka.auto.offset.reset" : "largest"
  }
}


Help would be greatly appreciated!


Thanks,

--Ben


Ben Vogan

May 24, 2016, 3:30:00 PM5/24/16
to druid...@googlegroups.com
My apologies.  As soon as I sent the message and looked over the Tranquility Kafka config, I saw that "format" : "json" was under the parser section instead of the parseSpec.
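
For anyone hitting the same stack trace: when the parseSpec has no "format" field, Druid does not treat the parser-level "format" as the parseSpec format, and the resulting parseSpec requires a "columns" list, which is why the root cause is "NullPointerException: columns". A corrected parser block for the config above (a sketch, keeping the same column and field names) would look roughly like:

```json
"parser" : {
  "type" : "string",
  "parseSpec" : {
    "format" : "json",
    "timestampSpec" : {
      "column" : "timestamp",
      "format" : "millis"
    },
    "flattenSpec" : {
      "useFieldDiscovery" : true,
      "fields" : [
        {
          "type" : "path",
          "name" : "nested-child",
          "expr" : "$.nested-root.nested-child"
        }
      ]
    },
    "dimensionsSpec" : {
      "dimensions" : ["nested-child", "afield"]
    }
  }
}
```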

