Decoupling Celery workers from backend server

onlinejudge95

Mar 10, 2024, 11:45:58 AM
to celery-users
Hi Everyone,

I have the following use case.
I deploy my FastAPI backend together with a Celery worker on Docker Swarm, and I use Redis as both the result backend and the broker. Right now the Celery worker uses the same Docker image as my backend server, which makes the server image fairly heavy, since it bundles the libraries for the compute-intensive tasks that only the worker runs. I want to decouple the worker from the server.

I tried the following: the decoupled worker connects to the same Redis instance and defines a simple task that just prints the arguments passed to it. From the server, I trigger that task with this bit of code:

```
from celery import Celery
from fastapi import encoders

# `conf` holds my settings, including the Redis broker/backend URIs
celery = Celery(
    "backend_tasks",
    backend=str(conf.CELERY_BACKEND_URI),
    broker=str(conf.CELERY_BROKER_URI),
    include=["src.task.task"],
)

# Trigger the task by name; `sports` comes from the request handler
celery.send_task("cache_sport", args=("sport", [encoders.jsonable_encoder(sports)]))
```
In my decoupled worker, I have defined the Celery task like this:
```
import typing

from celery import Celery

celery = Celery(
    "worker_tasks",
    backend=str(conf.CELERY_BACKEND_URI),
    broker=str(conf.CELERY_BROKER_URI),
    include=["src.task.task"],
)


# Explicit name so it matches what the server passes to send_task()
@celery.task(name="cache_sport")
def cache_sport(key: str, value: typing.List[typing.Any]):
    print(f"{key = }, {value = }")
```
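As a sanity check in the worker's environment, the task name does show up in the app's local task registry once `src.task.task` is imported (just a quick check I run from a shell inside the worker container; `celery` is the worker app defined above):
```
# The registry key is the explicit name passed to @celery.task(...)
assert "cache_sport" in celery.tasks
```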
When I run the server logic, my decoupled worker does not pick up the task; instead it is routed to my old worker, which fails because it does not have the task `cache_sport` registered. Any pointers on how to debug this?
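
In case it helps, the direction I was considering next is routing the task to a dedicated queue that only the decoupled worker consumes. A rough sketch (the queue name `worker_queue` is just a placeholder, and I have not verified this yet):
```
# Server side: send the task to an explicit queue instead of the default "celery" queue
celery.send_task(
    "cache_sport",
    args=("sport", [encoders.jsonable_encoder(sports)]),
    queue="worker_queue",
)

# Decoupled worker side: consume only that queue, e.g.
#   celery -A <app_module> worker -Q worker_queue --loglevel=INFO
```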