Should I modify response.headers in order to get nginx's uwsgi_cache work properly?


Lisandro

Mar 24, 2017, 9:00:06 AM3/24/17
to web2py-users
I'm running a web2py website with public articles, and there are occasional traffic peaks.
I use nginx as the web server and uWSGI to run my web2py application.

Since the articles are public (the HTML page of an article is the same for every visitor), I'm already doing some caching to improve performance (I'm using the @cache.action decorator with the Redis cache model).
However, and please correct me if I'm wrong, for every request made to the URL of an article, the models still need to be executed before the cached response can be served.
So I thought I could improve performance even more by caching the HTML directly in nginx; that way I would save resources on my server.

However, I'm having a hard time getting it to work, and I wanted to know whether I should modify response.headers.
I've read that web2py sets them by default.

To do some tests, I have this simple web2py function:

def test():
    from datetime import datetime
    return datetime.now().strftime('%H:%M:%S')



On the nginx side, the server block configuration is this:

uwsgi_cache_path /tmp/nginx_cache/ levels=1:2 keys_zone=mycache:10m max_size=10g inactive=10m use_temp_path=off;

server {
    ...

    location / {
        # response header to check if cache is a HIT or a MISS
        add_header              X-uWSGI-Cache $upstream_cache_status;

        # server cache
        uwsgi_cache  mycache;
        uwsgi_cache_valid  15m;
        uwsgi_cache_key  $request_uri;

        # client cache
        expires 3m;

        uwsgi_pass      unix:///tmp/web2py.socket;
        include         uwsgi_params;
        uwsgi_param     UWSGI_SCHEME $scheme;
    }
}


But every time I hit the test page, I check the response headers and always see a MISS.
In other words, nginx still sends every request to uWSGI, and the page is generated on each request.
I found a forum post where someone says this:

"...it looks to me like the issue is that the upstream server is just not sending response that contain an expiration date (Expires:) or a cache validator (for instance, Last-Modified:). (The cookie expiration time has nothing to do with caching.)
The HTTP 1.1 spec says: 'If there is neither a cache validator nor an explicit expiration time associated with a response, we do not expect it to be cached, but certain caches MAY violate this expectation (for example, when little or no network connectivity is available).'"
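Following that quote, the headers a cache looks for could be produced on the web2py side. Here is a minimal sketch, assuming one simply wants a correct Expires and Cache-Control pair; a plain dict stands in for web2py's response.headers, and the helper name cache_headers is mine, not part of web2py:

```python
# Sketch: build the Expires / Cache-Control headers that a cache needs,
# as one might set them on web2py's response.headers (a plain dict here).
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime  # RFC 1123 date formatting

def cache_headers(max_age_seconds):
    expires = datetime.now(timezone.utc) + timedelta(seconds=max_age_seconds)
    return {
        'Cache-Control': 'public, max-age=%d' % max_age_seconds,
        # e.g. 'Fri, 24 Mar 2017 12:03:06 GMT'
        'Expires': format_datetime(expires, usegmt=True),
    }

# In a controller one might then do:
# for k, v in cache_headers(180).items():
#     response.headers[k] = v
```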


So I thought I would still need to use the @cache.action decorator (with cache_model=None, so that it only sets the response headers that allow client caching):

@cache.action(time_expire=222, cache_model=None, session=False, vars=False, public=True)
def test():
    from datetime import datetime
    return datetime.now().strftime('%H:%M:%S')


However, I still can't get it to work.
I set time_expire=222 to check whether the "expires 3m;" directive in nginx's configuration would override it, and it does: the responses have Cache-Control: max-age=180 (that is, 3 minutes, not 222 seconds).

I don't intend to get into nginx's configuration variables here, but I'm tempted to ask: am I missing something on the web2py side?
Do I need to modify response.headers in some other way so that nginx caches the response from uWSGI?


Lisandro

Apr 17, 2017, 8:15:21 AM4/17/17
to web2py-users
I've been dealing with this problem for some time now.
Some weeks ago I posted a question on Stack Overflow, but I haven't found a solution yet.

I've also been working with a sysop who knows this better than I do, but we still couldn't solve the problem.

I don't think web2py has anything to do with it, so I'll mark this thread as "no action required".
Still, any comments or suggestions will be appreciated.

Best regards,
Lisandro

Niphlod

Apr 19, 2017, 8:28:12 AM4/19/17
to web2py-users
It'd be easier to see whether it works with a slightly modified uwsgi_cache_valid directive:

uwsgi_cache_valid any 1m;

because if you omit the three-parameter notation, only 200, 301 and 302 responses are cached.

Then you can try returning a proper X-Accel-Expires header, which should "trump" any other header.
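Setting that header from a web2py controller might look like the sketch below; X-Accel-Expires is read and stripped by nginx before the response reaches the client, and again a plain dict stands in for response.headers:

```python
# Sketch: X-Accel-Expires tells nginx how long (in seconds) to cache this
# response, overriding Expires and Cache-Control. A value of '0' disables
# caching for the response.
def accel_expires_headers(seconds):
    return {'X-Accel-Expires': str(seconds)}

# In a web2py controller, one might write:
# def test():
#     from datetime import datetime
#     response.headers['X-Accel-Expires'] = '60'   # let nginx cache for 60s
#     return datetime.now().strftime('%H:%M:%S')
```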

uwsgi_cache is an application cache (read: you could use it instead of Redis if your whole app is served through that nginx process).

If you're not looking for an IPC cache, which is what uwsgi_cache is (basically what cache.ram does for a single process), maybe it'd be better to use the upstream proxy cache via the proxy_cache and proxy_cache_path directives.

BTW: the $upstream_cache_status you use shouldn't be taken as proof that uwsgi_cache is used at all. It's the "flag" that the upstream cache is working (the proxy_cache directive).

Lisandro

Apr 19, 2017, 9:04:59 AM4/19/17
to web2py-users

On Wednesday, April 19, 2017 at 9:28:12 AM (UTC-3), Niphlod wrote:
It'd be easier to see whether it works with a slightly modified uwsgi_cache_valid directive:

uwsgi_cache_valid any 1m;

because if you omit the three-parameter notation, only 200, 301 and 302 responses are cached.

Yes, I forgot to mention that I had already tried that, without success.


Then you can try returning a proper X-Accel-Expires header, which should "trump" any other header.

uwsgi_cache is an application cache (read: you could use it instead of Redis if your whole app is served through that nginx process).

If you're not looking for an IPC cache, which is what uwsgi_cache is (basically what cache.ram does for a single process), maybe it'd be better to use the upstream proxy cache via the proxy_cache and proxy_cache_path directives.

BTW: the $upstream_cache_status you use shouldn't be taken as proof that uwsgi_cache is used at all. It's the "flag" that the upstream cache is working (the proxy_cache directive).


Right now I'm using Redis to cache HTML responses. Since I'm using several uwsgi workers, with Redis the same cache is shared among all the processes (correct me if I'm wrong, but I think that's right).
Actually, I remember that I previously tried cache.ram, but logically it didn't work; or, better said, each process had its own cache and they couldn't share it, which is why I moved to Redis.
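That per-process behavior can be illustrated with a small sketch (plain Python, not web2py code): each forked worker gets its own copy of an in-process dict standing in for cache.ram, so writes in one worker are invisible to the others.

```python
# Sketch: an in-process cache is not shared between worker processes,
# which is why cache.ram appears "empty" in every uwsgi worker.
import multiprocessing

local_cache = {}  # stands in for a per-process cache like cache.ram

def worker(queue):
    # every child process starts from its own copy of local_cache
    local_cache['hits'] = local_cache.get('hits', 0) + 1
    queue.put(local_cache['hits'])

def run_workers(n=3):
    queue = multiprocessing.Queue()
    procs = [multiprocessing.Process(target=worker, args=(queue,))
             for _ in range(n)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return sorted(queue.get() for _ in range(n))

# every worker reports a hit count of 1: no process sees another's writes
```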

When you say "...an IPC cache that is uwsgi_cache...", do you mean that uwsgi_cache wouldn't be suitable for caching across processes? I got lost in translation there.
Anyway, I must admit I'm way beyond my limits with this; I'm having a hard time understanding it all. I think I'll have to hire someone who knows better than me :)
I've been looking for someone to work with; my first goal was to find somebody here in my city, but it's a small city and nobody here works with these technologies.

I don't want to clutter this list with things that aren't related to web2py.
Thank you very much for your time! I'll keep digging.

Best regards,
Lisandro.