Consuming Kafka Messages with Retry Strategy and Backpressure

Pavel Kokoshnikov

Jan 16, 2024, 6:38:24 AM
to SmallRye

Hello SmallRye Team,

I am currently working on a project involving Kafka and have run into a challenge I believe your expertise can help with. In our application, we need to endlessly retry Kafka messages when specific exceptions occur, such as database downtime or 500 errors from a downstream service. Such outages can last for extended periods, and it is critical to keep memory usage within limits during that time. We therefore need to cap the number of messages being processed at any given moment.

Could you advise on how we can achieve this within the SmallRye framework? Your insights on this matter would be highly valuable.

Thank you for your assistance!

Pavel Kokoshnikov

Jan 17, 2024, 2:15:06 AM
to SmallRye
I found a simple solution.

1. @Retry(maxRetries = -1) gives endless retries.
2. KafkaRecordStreamSubscription.java pauses polling once the internal record queue exceeds max.poll.records * max-queue-size-factor, so rebalances and unbounded memory consumption are not a problem here (see the sketch below).
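
A minimal sketch of how the two pieces fit together (the channel name "orders", the topic, the 1-second delay and the max.poll.records value are placeholders for illustration, not anything prescribed by SmallRye; as far as I understand, maxDuration = 0 is also needed to lift the default 3-minute cap on total retry time):

import jakarta.enterprise.context.ApplicationScoped;
import io.smallrye.reactive.messaging.annotations.Blocking;
import org.eclipse.microprofile.faulttolerance.Retry;
import org.eclipse.microprofile.reactive.messaging.Incoming;

@ApplicationScoped
public class OrderConsumer {

    @Incoming("orders")
    @Blocking                 // processing talks to the database, keep it off the event loop
    @Retry(maxRetries = -1,   // -1 = retry forever
           maxDuration = 0,   // no cap on total retry time
           delay = 1000)      // wait 1 s between attempts
    public void consume(String payload) {
        // Any exception thrown here (DB down, HTTP 500 from a downstream service) makes
        // Fault Tolerance retry this invocation; the record is acknowledged only once the
        // method returns without throwing, so nothing is lost while we keep retrying.
        process(payload);
    }

    void process(String payload) {
        // business logic (hypothetical)
    }
}

And the channel configuration that bounds memory while the consumer is stuck retrying: the connector pauses polling once its internal queue exceeds max.poll.records * max-queue-size-factor.

mp.messaging.incoming.orders.connector=smallrye-kafka
mp.messaging.incoming.orders.topic=orders
mp.messaging.incoming.orders.max.poll.records=50
mp.messaging.incoming.orders.max-queue-size-factor=2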

Tuesday, January 16, 2024 at 14:38:24 UTC+3, Pavel Kokoshnikov: