Python: How to run multiple consumers to RabbitMQ queue using Pika

deepak kumar

Jun 8, 2018, 2:50:10 AM
to rabbitmq-users
In the example on https://www.rabbitmq.com/tutorials/tutorial-two-python.html we have to manually run multiple workers in individual command prompt windows. Is there a more sophisticated way to do this that requires less human intervention? Say you have to run 100 workers at a time; you cannot go with this approach. Please suggest.

Arnaud Cogoluègnes

Jun 8, 2018, 5:08:06 AM
to rabbitm...@googlegroups.com
You can run several consumers in the same process; the number could be a program argument.
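
For illustration, a minimal sketch of that idea using threads, where each consumer gets its own connection and channel and the count is read from the command line (the queue name, host, and Pika 1.x API are assumptions on my part):

# Illustrative sketch: start N consumers in one process, N taken from the command line.
# Each thread opens its own connection and channel, since Pika objects must not be shared across threads.
import sys
import threading

import pika

QUEUE = 'task_queue'  # assumed queue name from the tutorial

def consume(worker_id):
    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE, durable=True)
    channel.basic_qos(prefetch_count=1)

    def callback(ch, method, properties, body):
        print(' [x] worker %d received %r' % (worker_id, body))
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue=QUEUE, on_message_callback=callback)
    channel.start_consuming()

if __name__ == '__main__':
    count = int(sys.argv[1]) if len(sys.argv) > 1 else 1
    threads = [threading.Thread(target=consume, args=(i,)) for i in range(count)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

Running it as "python workers.py 100" (the file name is arbitrary) starts 100 consumers in a single process.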

Luke Bakken

Jun 8, 2018, 6:21:21 PM
to rabbitmq-users
Hi Deepak,

This is a general programming question, not specific to Pika or Python.

You have several options available, including forking subprocesses and using threads. Please note that Pika is not multiprocess- or thread-safe, so you must take that into consideration when coding your application. The simplest option is to use one connection and its channels per thread/worker.
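
For example, a minimal sketch of the subprocess option, where each worker process opens its own connection and channel so no Pika object is shared (queue name, host, worker count, and the Pika 1.x API are assumptions):

# Illustrative sketch: one subprocess per worker, each with its own connection and channel.
import multiprocessing

import pika

QUEUE = 'task_queue'  # assumed queue name

def worker(worker_id):
    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE, durable=True)
    channel.basic_qos(prefetch_count=1)

    def on_message(ch, method, properties, body):
        print(' [x] worker %d got %r' % (worker_id, body))
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue=QUEUE, on_message_callback=on_message)
    channel.start_consuming()

if __name__ == '__main__':
    processes = [multiprocessing.Process(target=worker, args=(i,)) for i in range(4)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()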

Thanks,
Luke

Sumant Gedam

Jan 10, 2024, 2:10:11 AM
to rabbitmq-users
Hey Luke,
As you have suggested, we should use a connection per thread/worker strategy. Can we apply the same if our worker is already running before multiprocessing starts?
In my case, I have a REST API; as soon as I trigger the URL, it publishes a message for the consumer to fetch (using channel.basic_publish). In my worker.py (consumer script) I initialize multiprocessing for the calculation, but it doesn't work and I get "[*] Waiting for messages. To exit press CTRL+C" printed multiple times. I am not sure if the approach is correct, and I am even thinking of running RabbitMQ on Docker rather than in a Python env.
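
For reference, a minimal sketch of this kind of setup, with the blocking consumer in the main process and the calculation handed to a multiprocessing pool (the queue name and calculate() are placeholders; keeping the setup under the __main__ guard stops spawned pool workers from re-importing and re-running it):

# Illustrative sketch: Pika consumer in the main process, CPU-bound work in a pool.
# QUEUE and calculate() are placeholders, not the original code.
import multiprocessing

import pika

QUEUE = 'task_queue'

def calculate(payload):
    # stand-in for the real calculation
    return len(payload)

def main():
    pool = multiprocessing.Pool(processes=4)
    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE, durable=True)
    channel.basic_qos(prefetch_count=1)

    def on_message(ch, method, properties, body):
        # blocks this callback until the pool worker finishes; very long jobs
        # may need extra care with connection heartbeats
        result = pool.apply(calculate, (body,))
        print(' [x] result: %r' % (result,))
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue=QUEUE, on_message_callback=on_message)
    print(' [*] Waiting for messages. To exit press CTRL+C')
    channel.start_consuming()

if __name__ == '__main__':
    # With the spawn start method, pool workers re-import this module;
    # the guard keeps them from re-running main() and the banner above.
    main()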
Thanks,
Sumant

Luke Bakken

Jan 10, 2024, 10:41:48 AM
to rabbitmq-users
Hi Sumant,

I can't assist you based on a vague description of what you're trying to do. If you'd like my assistance, do the following:
  • Make your code available via a git repository that I can clone and run. It must be easily runnable; I do not have time to figure out how to set it up.
  • Start a discussion here with a better description of what you're trying to do, and link to your code: https://github.com/pika/pika/discussions/
