Hi all,
I'm looking for a setup for the following problem:
An external application calls our API with an id of some sort.
That id is used to call another external API, which results in a list of other ids. A jobId is returned to the caller.
Those ids are fed, one by one or as a list, to another external API, and again a list of ids is returned.
This is repeated for a third system and these results are returned with the jobId given earlier.
We can't predict how long each API will take, so some form of backpressure is required so that we don't overload our own systems or the external ones. While I could argue this would make a nice Kafka flow, we can't use that (yet).
I imagine an InboundBuffer feeding a streaming reader implementation that reads from the buffer, performs the API request, and passes the result on to the next step.
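To make the idea concrete, here is a minimal sketch of what I have in mind, written with plain Python asyncio (not any particular framework; the `fake_api` call and stage names are stand-ins I made up). Each stage reads from a bounded queue and writes to the next one; a full queue blocks the producer on `put()`, which is the backpressure between stages:

```python
import asyncio

async def fake_api(stage: str, item: str) -> list[str]:
    # Stand-in for a real external API call; returns a list of new ids.
    await asyncio.sleep(0)  # simulate I/O
    return [f"{item}/{stage}-{i}" for i in range(2)]

async def stage(name: str, inbox: asyncio.Queue, outbox: asyncio.Queue) -> None:
    while True:
        item = await inbox.get()
        if item is None:              # end-of-stream marker: propagate and stop
            await outbox.put(None)
            return
        for result in await fake_api(name, item):
            await outbox.put(result)  # blocks when outbox is full -> backpressure

async def run(job_id: str, initial_id: str) -> list[str]:
    q1, q2, q3, sink = (asyncio.Queue(maxsize=4) for _ in range(4))
    tasks = [
        asyncio.create_task(stage("first", q1, q2)),
        asyncio.create_task(stage("second", q2, q3)),
        asyncio.create_task(stage("third", q3, sink)),
    ]
    await q1.put(initial_id)
    await q1.put(None)
    results = []
    while (item := await sink.get()) is not None:
        results.append(item)
    await asyncio.gather(*tasks)
    return results  # would be returned to the caller under the earlier jobId

results = asyncio.run(run("job-42", "id-0"))
```

With the stubbed API fanning each id out into two, the single initial id ends up as eight results after the three stages.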
Is there something built in for this already? It's pretty hard to figure out concrete steps to take based on the code and the examples.
Is there a processor handler in the streams flow that kinda 'maps' the input to the result of the external API call and sends that to a writer?
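What I mean by 'maps' is roughly this (again a hypothetical sketch in plain asyncio, with `fake_lookup` standing in for the external API): apply an async call to every element while capping the number of in-flight requests, so the external system never sees more than `limit` concurrent calls:

```python
import asyncio

async def bounded_map(items, api_call, limit: int = 3):
    # Map api_call over items with at most `limit` calls in flight at once.
    sem = asyncio.Semaphore(limit)

    async def one(item):
        async with sem:  # acquired slot caps the concurrency
            return await api_call(item)

    return await asyncio.gather(*(one(i) for i in items))

async def fake_lookup(item: str) -> str:
    # Stand-in for the real external API call.
    await asyncio.sleep(0)
    return item.upper()

mapped = asyncio.run(bounded_map(["a", "b", "c"], fake_lookup))
```

A writer step would then just consume `mapped` (or a stream of such results) instead of collecting them in memory.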
Hope the problem is understandable; I can't imagine others haven't solved this before :)
TIA
Ronald