Looking for best option for Pyramid deployment

Thierry Florac

Feb 20, 2022, 8:53:02 AM
to pylons-...@googlegroups.com
Hi,
I've built a custom content management framework based on Pyramid. It's a classic web application built on ZODB and RelStorage with a PostgreSQL backend, a Redis cache, and an Elasticsearch index, and I'm currently looking for the best production deployment option.
Until now I have always used Apache with mod_wsgi, and I'm quite happy with it, but I'd like to know whether there is an option with better performance.
In development mode I have already tried the "native" Waitress server and a Gunicorn server, and I just tried PyPy with Waitress and Gunicorn, but performance was quite disappointing!
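For reference, the invocations I compared look roughly like this (the ini file name, module path, and port are placeholders, not my real setup):

```shell
# Pyramid's own pserve, which reads the [server:main] section of the ini
pserve production.ini

# Gunicorn loading the same PasteDeploy ini, with a few worker processes
gunicorn --paste production.ini --workers 4 --bind 127.0.0.1:8080

# Waitress via its CLI, pointing at a module-level WSGI application object
waitress-serve --listen=127.0.0.1:8080 myapp:application
```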

Any advice?

Best regards,
Thierry

Mikko Ohtamaa

Feb 20, 2022, 11:28:41 AM
to pylons-...@googlegroups.com
Hi Thierry,


> I've built a custom content management framework based on Pyramid. It's a classic web application built on ZODB and RelStorage with a PostgreSQL backend, a Redis cache, and an Elasticsearch index, and I'm currently looking for the best production deployment option.
> Until now I have always used Apache with mod_wsgi, and I'm quite happy with it, but I'd like to know whether there is an option with better performance.
> In development mode I have already tried the "native" Waitress server and a Gunicorn server, and I just tried PyPy with Waitress and Gunicorn, but performance was quite disappointing!

I have been running Pyramid behind Apache and Nginx reverse proxies with mod_wsgi, uWSGI, and plenty of other options. I am currently running a Cloudflare + Caddy + Waitress combination.

- Cloudflare is the easiest CDN for static media files, making JS and CSS load an order of magnitude faster than if they were served from my own web server
- Caddy is much simpler to configure than Apache or Nginx, especially when it comes to HTTPS, but still quite performant
- Waitress is good enough and it is headache-free, unlike uWSGI or mod_wsgi
- 99% of the time the performance penalty does not come from the web server components but from the Python application itself. There are no meaningful performance differences between web servers until you start to serve dozens of requests per second.
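That last point is easy to see with back-of-the-envelope arithmetic; the latencies below are illustrative assumptions, not measurements:

```python
# Assume the application itself (DB queries, ZODB loads, template rendering)
# takes 50 ms per request, while the WSGI server adds roughly 0.5 ms of
# overhead. Both figures are assumptions for illustration only.
app_ms = 50.0
server_ms = 0.5
total_ms = app_ms + server_ms
server_share = server_ms / total_ms
print(f"server share of total latency: {server_share:.1%}")  # about 1%
```

Even halving the server overhead would change total latency by a fraction of a percent, which is why profiling the application pays off before swapping servers.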

The architecture I am running is overviewed here: https://tradingstrategy.ai/blog/building-cryptocurrency-website

Here is the Caddy configuration for Waitress reverse proxy: https://github.com/tradingstrategy-ai/proxy-server/blob/master/Caddyfile#L29
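For readers who don't want to dig through the repo, a reverse proxy to a local Waitress process in Caddy v2 boils down to something like this (the domain and port are placeholders, not the values from my actual config):

```
example.com {
    # Caddy obtains and renews the TLS certificate automatically
    encode gzip
    reverse_proxy 127.0.0.1:8080
}
```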

Best regards,
Mikko
Jonathan Vanasco

Mar 15, 2022, 1:28:12 PM
to pylons-discuss
We run Pyramid via uWSGI, behind an OpenResty server (an Nginx distribution with an embedded Lua interpreter).

IMHO the "best" stack depends somewhat on your application. We have a large application, and uWSGI's forking model, combined with many of its management hooks, lets us aggressively exploit copy-on-write memory behavior. Gunicorn was not nearly as performant for our core app, though it has been more performant for other apps. We've run some smaller apps via Waitress as a test and forgot about them for months; they worked perfectly with no complaints.
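As a sketch of the preforking pattern (not our actual config; the paths, worker count, and socket are made up), the relevant uWSGI settings look roughly like this:

```ini
[uwsgi]
# preforking master: load the app once in the master, then fork workers
# so that read-only memory pages are shared copy-on-write
master = true
processes = 8
lazy-apps = false
# load the Pyramid app from a PasteDeploy ini (path is a placeholder)
paste = config:/srv/app/production.ini
# uwsgi-protocol socket that Nginx/OpenResty proxies to via uwsgi_pass
socket = 127.0.0.1:3031
```

With `lazy-apps = false` (the default), the application is imported before the fork, which is what makes the copy-on-write sharing possible.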

Caddy is certainly a decent server, and it has some of the best HTTPS termination and TLS certificate management in the field. It has a built-in ACME client and can store provisioned certificates in the cloud, which simplifies a lot of devops in clustered environments.

I stay away from Apache and personally don't recommend it. Nginx (and Caddy) are far better with concurrent requests and throughput, and they also have much lighter memory footprints. We can run many more uWSGI processes behind Nginx than behind any Apache deployment option, which means we can scale to more processes on a node before having to scale out to more nodes in a cluster. In my experience, from a devops management and billing perspective, Apache tends to be much more expensive.