* A shared redis instance does not seem to work well unless you use
different db numbers per project (there is a small settings sketch
after this list).
* I have not tested multiple celery setups against one rabbitmq, but it
should be possible with some routing configuration.
* The DB backend works ok and is easy to set up per project (no extra
setup or daemon to run apart from celeryd), but it is the least
performant backend.
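To make the db-number point concrete, a minimal sketch (the project
split is made up, and the URL form is how newer celery releases spell
it; older releases use separate BROKER_* settings):

    # settings.py of the first project -- use redis db 0
    BROKER_URL = "redis://localhost:6379/0"

    # settings.py of the second project -- use redis db 1 so its queues
    # and messages stay isolated from the first project's
    BROKER_URL = "redis://localhost:6379/1"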
If all your projects share tasks you may even run celeryd on its own
and configure all the apps to send tasks to it; that works ok. I have
one celeryd instance accepting shared tasks from many apps using the
redis backend (rabbitmq should work as well) and it seems to scale ok.
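Roughly how that looks with a recent celery (the task name and paths
are made up for the example): the producing apps send tasks by name, so
they never need to import the worker's code.

    # tasks.py -- loaded only by the single shared celeryd/worker instance
    from celery import Celery

    app = Celery("shared", broker="redis://localhost:6379/0")

    @app.task(name="shared.resize_image")
    def resize_image(path):
        # the actual work lives only on the worker side
        return path

    # any producing Django app -- it only needs the broker URL and the
    # task name, not the task's code
    from celery import Celery

    client = Celery(broker="redis://localhost:6379/0")
    client.send_task("shared.resize_image", args=["/tmp/photo.jpg"])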
If you are managing multiple celery instances you may want to take a
look at supervisord (http://supervisord.org/).
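A supervisord program entry for one project's worker might look roughly
like this (the program name, paths and user are placeholders):

    [program:celeryd_project_a]
    command=/srv/project_a/env/bin/python manage.py celeryd --loglevel=INFO
    directory=/srv/project_a
    user=www-data
    autostart=true
    autorestart=true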
Regards,
Carlos Daniel Ruvalcaba Valenzuela
Second Arun’s suggestion on gearman. Pretty solid too.
Looking for an alternative to celery? Have you considered carrot? :P
Gearman would also require a separate worker process for each project
that you have operating in it.
If you think about it, that is a bit of a non-issue: every distributed
task queue will require a daemon process attached to it to receive
work, and if they are different projects they will clearly require
separate processes.
With celery/rabbitmq, you can run as many django projects, with as
many celeryd instances per project, as you want. You don't need a 1:1
mapping between projects and celeryd instances; we run many celeryd
instances for a single project (across many boxes). Each project
should use a separate vhost on rabbitmq, which you configure with
settings.BROKER_VHOST.
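For what it's worth, creating the per-project vhost usually amounts to
something like this (the vhost and user names are just placeholders):

    # as the rabbitmq admin
    rabbitmqctl add_vhost /project_a
    rabbitmqctl set_permissions -p /project_a project_a_user ".*" ".*" ".*"

    # in that project's Django settings
    BROKER_VHOST = "/project_a"
    BROKER_USER = "project_a_user"
    BROKER_PASSWORD = "change-me"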
Cheers
Tom