[kafka-client] NetworkClient.poll does not throw exceptions but just logs them?


Hao Ren

Jun 21, 2016, 12:33:38 PM
to Confluent Platform
Hi,

New to kafka and confluent platform.

I am using Confluent Platform 2.0.0 and kafka-clients 0.9.0.1.

I created a consumer to read Avro messages from Kafka.

I have also provided the schema registry URL to the AvroDeserializer for deserialization.

One test case is to catch an error/exception during message deserialization when a message pushed to the corresponding topic has the wrong format.

I managed to send a malformed message, "abc", to my topic using the console producer, which does not check the schema.

When I started to consume messages, it seemed that no exception was thrown; only some error logs showed up again and again for the duration of the poll.



for (ClientResponse response : responses) {
    if (response.request().hasCallback()) {
        try {
            response.request().callback().onComplete(response);
        } catch (Exception e) {
            log.error("Uncaught error in request completion:", e);
        }
    }
}
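[Editor's note] The behaviour follows from Java's exception hierarchy: `catch (Exception e)` catches `BufferUnderflowException` (a `RuntimeException`), but an `Error` is not an `Exception` and escapes that handler. A minimal self-contained sketch of the same semantics (class and method names are illustrative, not Kafka's):

```java
public class CatchDemo {
    // Stand-in for NetworkClient's response handler: catch (Exception e)
    // and "log" it, here by returning a marker string instead of log.error.
    static String runCallback(Runnable callback) {
        try {
            callback.run();
            return "ok";
        } catch (Exception e) {
            return "logged: " + e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        // BufferUnderflowException is a RuntimeException, so it is swallowed:
        System.out.println(runCallback(() -> {
            throw new java.nio.BufferUnderflowException();
        }));
        // An Error is not an Exception, so catch (Exception e) lets it escape:
        try {
            runCallback(() -> { throw new Error("boom"); });
        } catch (Error err) {
            System.out.println("propagated: " + err.getMessage());
        }
    }
}
```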


What I observed is that a java.nio.BufferUnderflowException occurs in the `onComplete` call, and then the error log is shown periodically.

I am trying to understand this code snippet: it seems that even if there is an exception, it is just logged as an error so that the consumer does not stop.
But why not throw the exception and let the user handle it? In my case, I want to catch the BufferUnderflowException during deserialization for testing, and that does not seem to be possible.

Any help is highly appreciated.

Thank you

Hao

Hao Ren

Jun 26, 2016, 5:46:40 PM
to confluent...@googlegroups.com
Can anyone check this, please?
If the problem is not clear, I will explain it in detail.
Thank you.




--
Hao Ren

Data Engineer @ leboncoin

Paris, France

Jason Gustafson

Jul 20, 2016, 12:06:35 AM
to Confluent Platform
Hey Hao,

Ouch! This bug appears to go back surprisingly far in the development of the new consumer. Ironically, I have been working this past week on a patch which (unintentionally) fixes the problem by moving the message parsing out of the response handler. It would have looked much less embarrassing if you had waited another week! The link is here: https://github.com/apache/kafka/pull/1627. I'll open a JIRA and make sure this gets fixed in 0.9.0 and 0.10.0 as well. Unfortunately, there may not be a great workaround while you await the fix. The best option that comes to mind is to wrap the deserializer and raise an instance of Error (not Exception), which isn't caught by that handler.
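[Editor's note] Jason's wrapping workaround might look roughly like the sketch below, using a too-short `ByteBuffer` read as a stand-in for the real Avro decoding; the class and method names are illustrative, not the actual Confluent API:

```java
import java.nio.ByteBuffer;

public class ErrorWrapDemo {
    // Hypothetical wrapper around a deserializer: rethrow any Exception as an
    // Error, which the consumer's catch (Exception e) handler does not swallow.
    static Object deserializeOrError(byte[] data) {
        try {
            // Stand-in for the real Avro decoding: reading an int from a
            // too-short buffer throws BufferUnderflowException, much like a
            // malformed message does inside the deserializer.
            return ByteBuffer.wrap(data).getInt();
        } catch (Exception e) {
            throw new Error("Deserialization failed", e);
        }
    }

    public static void main(String[] args) {
        try {
            deserializeOrError("abc".getBytes()); // only 3 bytes, getInt needs 4
        } catch (Error err) {
            // The Error travels out of the handler, so the caller can see it.
            System.out.println("caught: " + err.getCause().getClass().getSimpleName());
        }
    }
}
```

Because `Error` is not a subclass of `Exception`, it passes straight through the `catch (Exception e)` in the response handler and surfaces at the `poll()` call site.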

-Jason