I'd like to use a lightweight dispatching system for a web server.
Basically, some Django POST requests might require processing that should be done in the background due to its run time.
The results would be added to the Django database.
The browser could verify via AJAX requests whether the task is finished.
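To make this concrete, here is a rough sketch of the polling side I have in mind (the model and view names are just placeholders, not code from my project):

# models.py -- hypothetical task record that the background worker fills in
from django.db import models

class TaskResult(models.Model):
    created = models.DateTimeField(auto_now_add=True)
    finished = models.BooleanField(default=False)
    result = models.TextField(blank=True)

# views.py -- the browser polls this URL via AJAX until "finished" is true
import json
from django.http import HttpResponse
from django.shortcuts import get_object_or_404

def task_status(request, task_id):
    task = get_object_or_404(TaskResult, pk=task_id)
    data = {"finished": task.finished, "result": task.result}
    return HttpResponse(json.dumps(data), content_type="application/json")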
The server would be running on a rather weak virtual machine with rather low memory (nginx / uwsgi / django).

(For testing I run the server on Windows with one of the following setups, depending on what I'd like to test:
- django runserver
- twisted - django
- cygwin/nginx - fastcgi - django)
Most people seem to recommend Celery with RabbitMQ.
If I understood correctly, RabbitMQ requires Erlang to be installed, and I found some posts indicating that RabbitMQ needs quite a bit of memory.
So I wondered whether Celery / RabbitMQ wouldn't be a little on the heavy side and eat away a little too much of my memory.
Is there any good lightweight dispatching alternative to Celery, or would this be one of those 'roll your own dispatcher' tasks?
On my personal, doesn't-really-do-much, 5k-messages-a-day home server, RabbitMQ uses a grand total of 19 MB of RAM.
On one of our production servers handling millions of messages a day, RabbitMQ uses a total of 27 MB of RAM.
I guess it all depends on your definition of "too much". I doubt a home-brew Python process would be as slender.
Cheers
Tom
Check the Celery page for alternative brokers. I have tested the DB broker and it works OK; the Redis broker worked wonders in another setup where it was also being used for caching.
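Switching brokers is mostly a settings change, something like this (rough sketch, the exact setting names depend on your Celery / django-celery version):

# settings.py
# Database broker: no extra broker process, only the Celery worker.
# (If I remember correctly, you add 'kombu.transport.django' to
#  INSTALLED_APPS and run syncdb for this one.)
BROKER_URL = "django://"

# Redis broker: fast, and the same Redis instance can double as a cache.
# BROKER_URL = "redis://localhost:6379/0"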
Regards,
Carlos Daniel Ruvalcaba Valenzuela
Celery is very good, but since you will be running uWSGI in production, you can look at the spooler:
http://projects.unbit.it/uwsgi/wiki/Spooler
and its abstraction:
http://projects.unbit.it/uwsgi/wiki/Decorators#spool
(check https://github.com/jaysonsantos/django-uwsgi-mail for a real-world example)
If you want to go lower-level, have a look at mules:
http://projects.unbit.it/uwsgi/wiki/Mules
django-zeromq (as someone already suggested) is also very good.
Another solution is using Python thread queues:
http://projects.unbit.it/uwsgi/wiki/Example#threadqueue
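A minimal sketch of the spooler decorator in use (assuming uWSGI is started with a spooler directory, e.g. --spooler /path/to/spool; the task body is just a placeholder):

# tasks.py -- must be imported by the uWSGI workers
from uwsgidecorators import spool

@spool
def long_task(arguments):
    # "arguments" is a dict of strings read back from the spool file
    item_id = arguments["item_id"]
    # ... do the slow work here and save the result in the Django database ...

# in a Django view: enqueue and return immediately
#   long_task.spool(item_id=str(some_object.pk))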
--
Roberto De Ioris
http://unbit.it
JID: rob...@jabber.unbit.it
To summarize quickly:
- RabbitMQ does not seem to be as greedy as I expected (probably around 19 MB for my expected load).
- django-ztask is a small brokerless solution.
- Celery supports multiple brokers with different performance characteristics; the DB broker would not require an additional process (apart from Celery itself).
- uWSGI has a spooler, which might be what I'm looking for: I just want to execute some tasks, which are too slow to handle directly within an HTTP request, sequentially one after the other.
However, if I used uWSGI I would have to find an alternative implementation and a small wrapper so that the system would still work on a Windows host without uWSGI (performance on Windows is not crucial, but it should work).
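Something like this little wrapper is what I have in mind (just a sketch; the thread fallback is only for my Windows test setup, and the names are made up):

# dispatch.py -- use the uWSGI spooler when running inside uWSGI,
# otherwise fall back to a plain daemon thread (Windows / runserver)
import threading

try:
    from uwsgidecorators import spool   # only importable inside uWSGI
    HAVE_UWSGI = True
except ImportError:
    HAVE_UWSGI = False

def background(func):
    """Wrap func(args_dict) so it runs outside the request/response cycle."""
    if HAVE_UWSGI:
        spooled = spool(func)
        def enqueue(**kwargs):
            # the spooler stores key/value strings on disk
            spooled.spool(**dict((k, str(v)) for k, v in kwargs.items()))
    else:
        def enqueue(**kwargs):
            t = threading.Thread(target=func, args=(kwargs,))
            t.daemon = True
            t.start()
    return enqueue

# usage:
#   run_report = background(run_report_impl)   # at import time
#   run_report(item_id=42)                     # in the view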
Now I just need some time to test some of these options on a Windows PC and on my tiny virtual Linux host.
Brian Schott
bfsc...@gmail.com