Hi,
Until now I have been very pleased with Celery. I use it to distribute tasks on my compute server. I started with ~10k tasks, which worked fine. Now that I have moved to ~100k tasks, RabbitMQ crashes after some time with the following error message:
eheap_alloc: Cannot allocate 1098556536 bytes of memory (of type "heap").
Aborted (core dumped)
I guess Celery is creating too many queues, but I don't know which setting I set wrongly.
This is my celeryconfig.py:
from utilities import get_number_cpu_cores
# List of modules to import when celery starts.
CELERY_IMPORTS = (...)
BROKER_URL = 'amqp://guest:guest@localhost:5672//'
CELERY_RESULT_BACKEND = "amqp"
CELERY_TASK_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json'] # Ignore other content
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/Berlin'
CELERY_ENABLE_UTC = True
CELERYD_CONCURRENCY = get_number_cpu_cores() * 2
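One thing I noticed while reading the docs: with the amqp result backend, each task gets its own result queue, and I have not set any expiry for them. Something like the following (untested on my side, values are just a guess) might be what I'm missing:

```python
# Possibly missing from my config? With the amqp result backend, result
# queues only go away once they expire or the result is consumed, so a
# short expiry should keep the queue count bounded. The value below is
# just an example, not something I have verified.
CELERY_TASK_RESULT_EXPIRES = 3600  # seconds; default is much longer

# Or, for tasks whose results I never read, disabling result storage
# entirely would avoid creating result queues at all:
CELERY_IGNORE_RESULT = True
```

Does that sound like the right direction, or is the queue growth coming from somewhere else?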
I also tried to log what RabbitMQ is doing. This is the last entry of vhosts.log:
Server: MochiWeb/1.1 WebMachine/1.10.0 (never breaks eye contact)
Date: Wed, 11 Jun 2014 19:03:34 GMT
Content-Type: application/json
Content-Length: 627
Cache-Control: no-cache
[{"message_stats":{"ack":59994,"ack_details":{"rate":0.0},"deliver":59994,"deliver_details":{"rate":0.0},"deliver_get":121603,"deliver_get_details":{"rate":0.0},"deliver_no_ack":1630,"deliver_no_ack_details":{"rate":0.0},"get":59979,"get_details":{"rate":0.0},"publish":118167,"publish_details":{"rate":0.0}},"messages":69099,"messages_details":{"rate":0.0},"messages_ready":69099,"messages_ready_details":{"rate":0.0},"messages_unacknowledged":0,"messages_unacknowledged_details":{"rate":0.0},"recv_oct":118595149,"recv_oct_details":{"rate":0.0},"send_oct":91945666,"send_oct_details":{"rate":0.0},"name":"/","tracing":false}]
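To get a feeling for the scale, I did some back-of-the-envelope arithmetic relating the failed allocation to the ready messages in that snapshot (assuming, which may well be wrong, that the allocation is dominated by queued-message state):

```python
# Rough arithmetic on the numbers above; the assumption that the failed
# heap allocation scales with ready messages is mine, not confirmed.
failed_alloc_bytes = 1098556536  # from the eheap_alloc error
messages_ready = 69099           # from the vhosts.log snapshot

bytes_per_message = failed_alloc_bytes / messages_ready
print(f"~{bytes_per_message / 1024:.1f} KiB per ready message")
```

That works out to roughly 15–16 KiB per ready message, which seems high to me for small JSON payloads, so I suspect per-queue overhead rather than the messages themselves.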
The last entry of queues.log is cut off, and the one before it is 64 MB.
Tell me if you need more information!
Thanks!