Kafka Consumer origin stops consuming when a large message is encountered


TJ Brown

Mar 16, 2017, 12:17:26 AM
to sdc-user
to sdc-user
Kafka consumers have a property, max.partition.fetch.bytes, that defines the largest message they can consume.  When the Kafka Consumer origin encounters a message that exceeds this limit, it stops consuming from the topic, yet it gives no indication that it has hit a message it cannot consume.  It appears as if the topic has been fully consumed (the origin even sends empty batches through the pipeline) when in fact messages remain.  I would expect the origin to signal the problem in some way: either stop the pipeline with an error message, or create an error record and continue with the subsequent messages.  What is the intended behavior in this situation?
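
For illustration, here is a minimal sketch of a plain Java consumer exercising the same limit; the broker address, group id, and topic name are placeholders, and the exact failure mode depends on the client version:

```java
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class LargeMessageDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "large-message-demo");      // placeholder group id
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        // The per-partition fetch limit under discussion (1 MiB is the default).
        props.put("max.partition.fetch.bytes", "1048576");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
            while (true) {
                // Depending on the client version, an oversized record either
                // raises a RecordTooLargeException from poll() or, as reported
                // above, the flow of records simply stops with empty results.
                ConsumerRecords<byte[], byte[]> records = consumer.poll(1000);
                System.out.printf("fetched %d records%n", records.count());
            }
        }
    }
}
```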

Thanks,
T.J. Brown

Jeff Evans

Mar 16, 2017, 11:50:43 AM
to TJ Brown, sdc-user
Hi TJ,

My understanding is that this is the intended behavior of Kafka: a consumer stops consuming messages if the next one is too large for it to handle (whether by way of this property or any of the similar configuration parameters).
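
One common workaround is to raise the consumer's per-partition limit to at least the largest message the broker will accept. A minimal sketch, assuming a broker-side message.max.bytes of 10 MiB (the value and class name here are illustrative):

```java
import java.util.Properties;

public class FetchSizeConfig {
    // Assumption: the broker's message.max.bytes is 10 MiB. The consumer's
    // per-partition fetch limit should be at least as large as the biggest
    // message the broker will accept, or consumption can stop at that record.
    static Properties consumerProps() {
        Properties props = new Properties();
        props.put("max.partition.fetch.bytes", String.valueOf(10 * 1024 * 1024));
        return props;
    }
}
```

In the Kafka Consumer origin, the same key can, as far as I know, be supplied through the origin's additional Kafka configuration properties.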




TJ Brown

Mar 21, 2017, 11:52:43 AM
to sdc-user, tjb...@gmail.com
Thanks for the explanation.   