Hi Jon
I have a RabbitMQ broker with two queues in it. RabbitMQ supports persistent storage on disk (durable queues plus messages published as persistent), so you can restart the broker without worrying about messages going missing before they reach the database, as long as the ACKs are set up properly.
I use Python to consume from Kafka, push each message to both RabbitMQ queues, and only then send the ACK back to Kafka. That way I can pull from the RabbitMQ queues in both prod and development without any issues.
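The shape of that bridge loop looks roughly like this — a minimal sketch with the broker calls stubbed out as plain functions so it stands alone (real code would use something like confluent-kafka and pika; the queue names here are made up):

```python
from typing import Callable, Iterable

def bridge(messages: Iterable[bytes],
           publish: Callable[[str, bytes], None],
           ack: Callable[[bytes], None],
           queues=("td_prod", "td_dev")) -> int:
    """Fan each Kafka message out to every RabbitMQ queue, and only
    ACK back to Kafka after *both* publishes succeed -- so a crash
    mid-loop just means a redelivery, never a lost message."""
    count = 0
    for msg in messages:
        for q in queues:
            # real code: channel.basic_publish(..., delivery_mode=2)
            # to a durable queue, so the message survives a restart
            publish(q, msg)
        ack(msg)  # real code: commit the Kafka offset here
        count += 1
    return count

# Dry run with in-memory stand-ins for the two brokers:
sent, acked = [], []
n = bridge([b"step1", b"step2"],
           publish=lambda q, m: sent.append((q, m)),
           ack=acked.append)
```

The important ordering is publish-then-ACK: committing the Kafka offset first would open a window where a crash drops a message on the floor.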
For the TD feed from Network Rail, I actually have two prod feeds, one for my database and one picked up by my WebSocket server.
There are probably more elegant ways to do this without Python in the middle, but it's far better than the jank I first built with plain Python queues and RPyC for consuming messages!