Kafka Connect - ElasticSearch JsonDeserializer error


Anurag Phadke

Oct 16, 2016, 4:46:40 PM
to Confluent Platform
Hello,
I am using Kafka Connect + Elasticsearch. The ES sink is now created and I can see it in the kafka-connect logs. However, whenever I send any data to the topic using:
docker run --net=host --rm confluentinc/cp-kafka:3.0.1 bash -c "echo '{\"foo\":\"bar\"}' | kafka-console-producer --request-required-acks 1 --broker-list localhost:29092 --topic foo && echo 'done.'"

the kafka-connect logs throw the following error:

[2016-10-16 20:36:29,980] ERROR Task elasticsearch-sink-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask)
org.apache.kafka.connect.errors.DataException: JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields
at org.apache.kafka.connect.json.JsonConverter.toConnectData(JsonConverter.java:332)
at org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:356)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:226)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:170)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:142)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:140)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:175)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)


This makes me believe that the data I am sending to the topic isn't in the format Kafka Connect expects. Full gist here: https://gist.github.com/anuragphadke/4ea11d5652b9088eccf792ffe5feac52

Any pointers?
-anurag

Shikhar Bhushan

Oct 17, 2016, 2:30:53 PM
to Confluent Platform
Hi Anurag,

It seems you are working with schemaless JSON data. You'll need this in your Connect worker configuration (Kafka < 0.10.1) or connector configuration (0.10.1+):
    key.converter.schemas.enable=false
    value.converter.schemas.enable=false
(the above config is hinted at by the exception; we should have better docs around this soon)
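
For reference, with schemas.enable=true the JsonConverter expects every message to be a JSON envelope that carries an inline schema alongside the data. A rough sketch of what your {"foo":"bar"} record would have to look like in that format:

    {
      "schema": {
        "type": "struct",
        "fields": [
          {"field": "foo", "type": "string", "optional": true}
        ],
        "optional": false
      },
      "payload": {"foo": "bar"}
    }

Since you are producing plain JSON from the console producer, disabling schemas as above is the simpler route.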

And in your ES connector configuration:
    schema.ignore=true
(refer to the ES connector docs for that option)
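
Putting it together, a minimal sketch of the sink connector properties (the name, connection.url, and type.name here are assumptions; adjust them to your setup):

    name=elasticsearch-sink
    connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
    tasks.max=1
    topics=foo
    connection.url=http://localhost:9200
    type.name=kafka-connect
    key.ignore=true
    schema.ignore=true

key.ignore=true is likely useful here too, since the console producer isn't sending record keys.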

Best,

Shikhar


Anurag

Oct 17, 2016, 2:32:07 PM
to confluent...@googlegroups.com
Thanks Shikhar - setting those options to false did the trick.
I'll write a detailed blog post soon.

-anurag


--
Twitter: @anuragphadke (https://twitter.com/#!/anuragphadke)