py4web using the Scheduler - Celery


Chris

unread,
May 25, 2021, 11:19:50 AM5/25/21
to py4web
Hi, I’ve created a small tutorial on configuring and using Celery in py4web. With Celery you can run periodic tasks, or execute long-running processes asynchronously without blocking other requests.

here is the tutorial:


Cheers.
Chris.


Alexander Beskopilny

unread,
May 25, 2021, 12:56:01 PM5/25/21
to py4web
https://github.com/ali96343/capp
My examples of py4web + celery + redis, and a port of https://github.com/miguelgrinberg/flask-celery-example to py4web.

Massimo

unread,
May 30, 2021, 3:56:21 AM5/30/21
to py4web
This is great. Would you mind linking it from the documentation?

Kevin Keller

unread,
May 30, 2021, 4:33:22 AM5/30/21
to Massimo, py4web
I am currently gathering what needs to be documented, or documented better, for a documentation sprint.

I will add this and the other things that need to be documented to the list.

Let me share the list with you once I am back at my desk.




Massimo DiPierro

unread,
May 30, 2021, 4:47:14 AM5/30/21
to Kevin Keller, py4web
Thank you Kevin.

Andrew Rogers

unread,
Aug 17, 2021, 7:49:17 PM8/17/21
to py4web
Thanks for the document Chris. It was very helpful. The web2py scheduler was a lot easier for a novice to set up and use, and I could manipulate tasks and whatnot through the database, which made it really easy to see what was going on. I am guessing, though, that the new approach is more robust and scalable. A few comments that other people running py4web on Windows or coming from web2py might find helpful. Oh, and this is the first time I have used Redis, Celery, and py4web, so take these comments with plenty of salt.

1. You need to have Celery and a Redis server running - these run separately from py4web.
2. If you are running py4web on Windows there are additional challenges.
3. The doc above recommends running "celery -A apps.YourApp.tasks worker --loglevel=info". Although this starts on Windows, it throws errors when a task runs. I needed to add "--pool=solo" to get it working. See https://stackoverflow.com/a/63010696/2193698
4. There doesn't seem to be any current redis-server build for Windows; the only ones I could find were very old. This link describes how I ended up getting it running: https://redis.com/blog/redis-on-windows-10/ - basically you install Linux on Windows directly from the Windows Store (which was very easy) and install redis-server there (which was also very easy). It only took about 10 minutes.
5. Don't forget to change the name of the task in the "scheduler.conf.beat_schedule" section. I think this was missed in the document above (see the sketch just below).
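
For reference, a minimal sketch of what such a beat_schedule entry might look like; the entry name, task path, and interval here are hypothetical, not from the tutorial:

scheduler.conf.beat_schedule = {
    "run-my-task-every-60s": {                 # hypothetical entry name
        "task": "apps.YourApp.tasks.my_task",  # dotted path to your own task
        "schedule": 60.0,                      # run every 60 seconds
    },
}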


Andrew Rogers

unread,
Aug 17, 2021, 7:52:59 PM8/17/21
to py4web
By the way, web2py gets a call-out on the celeryproject page. Maybe py4web should get its name in there too?



Andrew Rogers

unread,
Aug 17, 2021, 8:18:13 PM8/17/21
to py4web
And it seems like Celery doesn't need to use a redis-server:

Celery requires a message transport to send and receive messages. The RabbitMQ and Redis broker transports are feature complete, but there’s also support for a myriad of other experimental solutions, including using SQLite for local development.

Maybe the default install could have the Celery broker set to SQLite, especially given that py4web already defaults to a local SQLite db.
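
For anyone who wants to try that, a minimal sketch using kombu's SQLAlchemy transport (which the Celery docs mark as experimental; the app name and database filename below are hypothetical, and the sqlalchemy package must be installed):

from celery import Celery

# "sqla+sqlite://" is kombu's SQLAlchemy transport - experimental per the Celery docs
scheduler = Celery(
    "apps.myapp.tasks",                            # hypothetical app name
    broker="sqla+sqlite:///celery_broker.sqlite",  # local SQLite file as the broker
)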

Andrew Rogers

unread,
Aug 18, 2021, 12:57:10 AM8/18/21
to py4web
Oh, and the doc didn't explain that for the scheduler (beat) part of Celery to work you need to include the -B switch. But that doesn't work on Windows, so you need to launch Celery as a separate service to handle scheduled tasks:

celery -A apps.your_app.tasks beat
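
For comparison, on platforms where -B is supported, a single worker process can embed the beat scheduler (the Celery docs suggest this for development rather than production; app name hypothetical):

celery -A apps.your_app.tasks worker --loglevel=info -B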

Jim Steil

unread,
Oct 12, 2021, 4:33:21 PM10/12/21
to py4web
Thanks for this great write-up on using celery.

I've got it all working, and everything is great if I call celery from the command line in the py4web root directory. However, if I call it from a different directory, or if I try to daemonize it, I can't get it to run; it tells me my app is not valid.

Has anyone been able to get celery running as a daemon with apps.your_app.tasks as the application?

I'm trying to configure with systemd and am not having any luck.

-Jim

Chris

unread,
Oct 12, 2021, 4:59:27 PM10/12/21
to Jim Steil, py4web
Hi Jim, 

I have celery running as a daemon with supervisor on a Debian system:

apt-get install supervisor 

Config:

/etc/supervisor/conf.d/celery.conf

[program:celery]
command=celery -A apps.MyApp.tasks worker --loglevel=info
directory=/somepath/py4web/
autostart=true
autorestart=true
stderr_logfile=/var/log/celery.err.log
stdout_logfile=/var/log/celery.out.log
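
After dropping in that file, the standard supervisorctl commands pick it up and let you check on the worker ("celery" below matches the [program:celery] section name above):

sudo supervisorctl reread
sudo supervisorctl update
sudo supervisorctl status celery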

Hope this helps.

Cheers.
Chris.

Alexander Beskopilny

unread,
Oct 13, 2021, 2:38:00 AM10/13/21
to py4web
To load celery + uvicorn, a loader was written. The loader calculates the necessary directories and can stop/restart celery and uvicorn.

loader
(transport: socketio + redis)

in3, in4 - UI updaters with multiple celery-beat workers + pydal + sse-starlette + uvicorn + socketio

The applications (in3, in4) use different ports (3000, 5000) and work simultaneously.

Jim Steil

unread,
Oct 13, 2021, 9:36:28 AM10/13/21
to py4web
Thanks Alexander

I took a look at that yesterday but it confused me even more. What I'm looking for is the magic sauce to get celery working as a systemd daemon. I'm guessing I'm just having permission issues or something wrong in the config file, but I can't seem to find what I'm doing wrong.

-Jim

Jim Steil

unread,
Oct 13, 2021, 10:08:19 AM10/13/21
to py4web
FYI - just got it all working. It was permission issues; I just had to step back and look at it again. I'm pretty much using the default systemd configuration found here -> https://docs.celeryproject.org/en/stable/userguide/daemonizing.html#usage-systemd

I just had to modify the user/group and then fix permissions on the log files. It says right in the docs to update permissions on the log files, but somehow I missed it.

I finally found messages in /var/log/syslog telling me that permission issues on the log files were the problem.
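
For anyone following along, a stripped-down sketch of what such a unit file might look like; the paths, user/group, and app name are hypothetical, and the official example linked above (using celery multi and an EnvironmentFile) is more complete:

[Unit]
Description=Celery worker for a py4web app
After=network.target

[Service]
User=celery
Group=celery
# WorkingDirectory matters: apps.MyApp.tasks must be importable from here
WorkingDirectory=/somepath/py4web
ExecStart=/somepath/venv/bin/celery -A apps.MyApp.tasks worker --loglevel=info
Restart=always

[Install]
WantedBy=multi-user.target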

-Jim

AvE

unread,
May 6, 2024, 9:25:04 AM5/6/24
to py4web
FYI
I was trying to get it to work with Docker and had some startup problems, but got it working after some minor changes.
I am using a separate redis container: 
docker-compose: 
  .....
  py4web etc.
  .....
  redis:
    restart: always
    image: redis
    ports:
      - "6379:6379"

Based on _scaffold, in common.py I add:
# #######################################################
# Optionally configure celery
# #######################################################
if settings.USE_CELERY:
    from celery import Celery

    # to use "from .common import scheduler" and then use it according
    # to celery docs, examples in tasks.py
    scheduler = Celery(
        "apps.%s.tasks" % settings.APP_NAME,
        broker=settings.CELERY_BROKER,
        backend="redis://redis:6379/0",
    )
    # added to silence the CPendingDeprecationWarning about broker_connection_retry
    scheduler.conf.broker_connection_retry_on_startup = True

in settings.py:
# Celery settings
USE_CELERY = True
CELERY_BROKER = "redis://redis:6379/0"

my example task (in tasks.py):
import time
from .common import scheduler  # the Celery app configured above

@scheduler.task()
def MyLongRunningTask(arg1, arg2):
    time.sleep(30)
    return 'Task finished! Args: {} – {}'.format(arg1, arg2)
   
my docker entrypoint.sh to get the scheduler going and start py4web:
#!/bin/bash
. /home/py4web/.venv/bin/activate
# start beat and the worker in the background, then replace the shell with py4web
.venv/bin/celery -A apps.myapp.tasks beat &
.venv/bin/celery -A apps.myapp.tasks worker --loglevel=info &
exec py4web run --password_file password.txt --host 0.0.0.0 --port 8000 apps
 
controller:
To start the task:
# import the task from the tasks module
from .tasks import MyLongRunningTask
@action('MyLongRunningTask', method=['GET'])
@action.uses(db, session )
def long_running_task():
    results = MyLongRunningTask.delay('Hello', 'world!')
    print('Task id: {}'.format(results.id))
    print('Task status: {}'.format(results.status))
    return dict(id=results.id , status=results.status)

To get the status of the task:
from .common import scheduler           # instead of from celery.result import AsyncResult
@action('status/<id>', method=['GET'])
@action.uses(db, session)
def status(id=None):
    res = scheduler.AsyncResult(id)     # instead of AsyncResult(id)
    if res.status == 'SUCCESS':
        return res.status, res.get()
    return res.status
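
Once everything is up, the round trip can be exercised from the command line (using the app name and port assumed above):

curl http://localhost:8000/myapp/MyLongRunningTask
# -> returns the task id and an initial status such as PENDING
curl http://localhost:8000/myapp/status/<task-id>
# -> PENDING while the task sleeps, then SUCCESS with the result after ~30s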


Hope this can help some other beginners.
Cheers, Adri





Massimo

unread,
May 10, 2024, 1:33:28 AM5/10/24
to py4web
This is nice. It would be great if you could make a PR to the docs with these instructions :-)