RabbitMQ handling large messages + burst on a slow consumer


Sau

Oct 2, 2014, 9:00:05 PM
to rabbitm...@googlegroups.com
Hi All,

I am new to messaging, and to RabbitMQ in particular. I am trying to build a solution where a large message burst (JSON messages of ~200 KB at 6,000 msg/sec) spread across 24 queues needs to be consumed by slow consumers, one per queue, which deserialize each message and persist it in a database (PostgreSQL).

Option 1: Run the RabbitMQ broker with 16-32 GB of RAM so that it can absorb the burst.
Option 2: Use multiple consumer threads, each deserializing messages and persisting them. This will keep the queues from building up, but will put more load on the DB.
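For option 2, one way to keep queues from building up without issuing one INSERT per message is to have worker threads drain a bounded in-process buffer and write to the database in batches. Below is a minimal stdlib-only sketch of that pattern; the RabbitMQ delivery and the PostgreSQL table are simulated (a list stands in for the table), and in a real consumer the buffer would be fed from the client library's delivery callback, with a prefetch limit bounding how much the broker pushes at once. The batch size, buffer size, and thread count here are illustrative tuning assumptions, not recommendations:

```python
import json
import queue
import threading

BATCH_SIZE = 100     # messages per simulated DB transaction (tuning assumption)
BUFFER_LIMIT = 1000  # bounded buffer: put() blocks when full, giving backpressure

buffer = queue.Queue(maxsize=BUFFER_LIMIT)
persisted = []       # stands in for the PostgreSQL table
persisted_lock = threading.Lock()
SENTINEL = None      # one per worker, signals shutdown

def flush(batch):
    # In a real consumer this would be one multi-row INSERT (or COPY),
    # followed by acknowledging the batch's deliveries to the broker.
    with persisted_lock:
        persisted.extend(batch)

def worker():
    """Drain the buffer, deserialize JSON, and persist in batches."""
    batch = []
    while True:
        item = buffer.get()
        if item is SENTINEL:
            break
        batch.append(json.loads(item))
        if len(batch) >= BATCH_SIZE:
            flush(batch)
            batch = []
    if batch:            # flush any partial batch on shutdown
        flush(batch)

workers = [threading.Thread(target=worker) for _ in range(4)]
for w in workers:
    w.start()

# Simulate a burst of 6000 messages; put() blocks whenever the buffer
# is full, which is what keeps consumer-side memory bounded.
for i in range(6000):
    buffer.put(json.dumps({"id": i, "payload": "x" * 10}))

for _ in workers:
    buffer.put(SENTINEL)
for w in workers:
    w.join()

print(len(persisted))
```

Batching amortizes the per-transaction cost on the DB side, and the bounded buffer means a slow database slows the consumer down instead of growing memory without limit.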

Please let me know if there is a better solution available.

Thanks
Sau

Michael Klishin

Oct 3, 2014, 3:21:06 AM
to rabbitm...@googlegroups.com, Sau
You'll have to find a balance point between options 1 and 2, then.

If it's easier to give RabbitMQ more RAM (or to add cluster nodes and
distribute queues between them by declaring them on different nodes)
than to make the database handle concurrent writes, do that. But the two
are not mutually exclusive.
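To make the "distribute queues between nodes" part concrete: a classic (non-mirrored) queue lives on the node it was declared on, so one simple approach is to assign the 24 queue names to cluster nodes round-robin and declare each queue over a connection to its assigned node. A small sketch of just the assignment step, with made-up node and queue names for illustration (the actual declaration would happen through your client library against each node):

```python
# Hypothetical cluster nodes and queue names, for illustration only.
NODES = ["rabbit@node1", "rabbit@node2", "rabbit@node3"]
QUEUES = [f"ingest.q{i}" for i in range(24)]

def assign_round_robin(queues, nodes):
    """Map each queue name to a node, spreading them evenly."""
    return {q: nodes[i % len(nodes)] for i, q in enumerate(queues)}

assignment = assign_round_robin(QUEUES, NODES)
# Declaring each queue over a connection to its assigned node places it
# there, spreading queue memory and disk load across the cluster.
for q in ("ingest.q0", "ingest.q1", "ingest.q2", "ingest.q3"):
    print(q, "->", assignment[q])
```

With 24 queues over 3 nodes, each node ends up hosting 8 queues, so the burst's memory pressure is split three ways instead of landing on a single broker.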
--
MK

Staff Software Engineer, Pivotal/RabbitMQ