@cache.action not working properly with nginx, is any additional configuration required?


Lisandro

May 20, 2015, 7:46:48 AM
to web...@googlegroups.com
Are there any special considerations about @cache.action in production with nginx + uwsgi? I'm having problems using it, so I put together a simple test to show you.

In the default welcome application, in default.py controller:

@cache.action(time_expire=30, cache_model=cache.ram, session=False, vars=False, public=True)
def test():
    return request.now.strftime('%H:%M:%S')

Now, the first time you hit the /test url, you will see the current time.
But if you immediately hit the url again, you will see the time from the first hit. That value is shown for 30 seconds; that is, the result of the function is cached for 30 seconds.

This works fine with web2py's embedded server.
However, when I move it to nginx + uwsgi, @cache.action behaves strangely:
1) When I first hit the url, I see the current time.
2) Three seconds later, I hit the url again, and instead of the cached time I see the current time again.
3) Three or four seconds later, I hit the url again, and I see the first cached time.
4) I keep hitting the url, and it keeps returning previously cached values, apparently at random. But one thing is sure: it never returns the same value for 30 seconds; the code is executed at least every 4 or 5 seconds. So in a period of 30 seconds, no matter how many times I hit the url, the returned values always differ by 4 or 5 seconds.

Am I missing something? Is it necessary to configure something in nginx? I wouldn't think so.

Paolo Valleri

May 20, 2015, 8:11:58 AM
to web...@googlegroups.com
This is the correct behavior in a multi-process/multi-thread environment, because cache.ram is not shared across processes.
Use either memcache or redis to get a global cache.
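[To illustrate Paolo's point, here is a toy sketch of why a per-process RAM cache misbehaves under uwsgi. The `RamCache` class and `compute` function are invented for illustration; they only mimic the semantics of web2py's cache.ram, where each worker process holds its own dictionary.]

```python
import time

class RamCache:
    """Toy per-process in-memory cache, mimicking cache.ram semantics."""
    def __init__(self):
        self.store = {}

    def __call__(self, key, f, time_expire=30):
        now = time.time()
        hit = self.store.get(key)
        if hit is not None and now - hit[0] < time_expire:
            return hit[1]           # still fresh: serve from this process's RAM
        value = f()                 # miss (or expired): recompute
        self.store[key] = (now, value)
        return value

calls = []

def compute():
    calls.append(1)                 # count how many times the body really runs
    return "rendered page"

# Two uwsgi workers = two independent RAM caches.
worker_a, worker_b = RamCache(), RamCache()
worker_a("test", compute)           # worker A: miss, computes
worker_a("test", compute)           # worker A: hit
worker_b("test", compute)           # worker B: its own store is empty, computes again
print(len(calls))                   # 2: once per worker, not once overall
```

With several workers each missing independently, requests land on whichever worker happens to answer, which is exactly the "random previously cached values" behavior described above.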

Paolo

Lisandro

May 20, 2015, 9:03:30 AM
to web...@googlegroups.com
Thank you very much for the clarification.

Do you know of an example nginx + uwsgi configuration with memcache? I've already read this documentation:

I've already created the model file models/0_memcache.py with this code:

from gluon.contrib.memcache import MemcacheClient
memcache_servers = ['127.0.0.1:11211']
cache.memcache = MemcacheClient(request, memcache_servers)
cache.ram = cache.disk = cache.memcache

But I'm not sure about my nginx virtual host configuration. I'm concerned about the proper way of combining "all the stuff" in the nginx virtual server configuration (by "all the stuff" I mean gzip, location rules for static content, uwsgi_pass, etc.).

This is the nginx virtual server configuration that I've tried. It doesn't work: the website runs ok, but there is still no caching.

server {
    listen       80;
    server_name  dev.demo;
    root /home/gonguinguen/medios;
    
    location ~* ^/(\w+)/static(?:/_[\d]+\.[\d]+\.[\d]+)?/(.*)$ {
        alias /home/gonguinguen/medios/applications/$1/static/$2;
        expires max;
    }
    
    location ~* ^/(\w+)/static/ {
        root /home/gonguinguen/medios/applications;
        expires max;
    }
    
    location / {
        set            $memcached_key "$uri?$args";
        memcached_pass 127.0.0.1:11211;
        error_page     404 502 504 = @fallback;
        
        uwsgi_pass      unix:///tmp/medios.socket;
        include         uwsgi_params;
        uwsgi_param     UWSGI_SCHEME $scheme;
        uwsgi_param     SERVER_SOFTWARE    nginx/$nginx_version;
    }

    location @fallback {
        uwsgi_pass      unix:///tmp/medios.socket;
    }
}




I don't know where or how to check whether memcache is actually caching anything.
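[One way to answer this last question: memcached's plain-text protocol has a `stats` command whose `get_hits`/`get_misses` counters reveal whether anything is being cached. A minimal sketch, using only the standard library; `memcached_stats` assumes a memcached instance is reachable at the given host/port, while `parse_stats` can be checked offline on a canned response.]

```python
import socket

def parse_stats(raw):
    """Parse a memcached `stats` text-protocol response into a dict."""
    stats = {}
    for line in raw.decode().splitlines():
        if line == 'END':
            break
        if line.startswith('STAT '):
            _, name, value = line.split(' ', 2)
            stats[name] = value
    return stats

def memcached_stats(host='127.0.0.1', port=11211):
    """Query a live memcached for its stats (requires a running server)."""
    with socket.create_connection((host, port), timeout=2) as s:
        s.sendall(b'stats\r\n')
        raw = b''
        while not raw.endswith(b'END\r\n'):
            raw += s.recv(4096)
    return parse_stats(raw)

# Offline sanity check of the parser on a canned response:
sample = b'STAT curr_items 3\r\nSTAT get_hits 42\r\nEND\r\n'
print(parse_stats(sample))
```

From a shell, `echo stats | nc 127.0.0.1 11211` gives the same raw output; if `curr_items` stays at 0 after hitting the page, web2py is not writing to memcached.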

Thanks in advance!

Paolo Valleri

May 20, 2015, 9:35:35 AM
to web...@googlegroups.com
The web2py part you have posted is more than enough. Did you try it by itself?
Getting nginx to talk to memcached is another thing entirely.


 Paolo


Lisandro

May 20, 2015, 10:12:48 AM
to web...@googlegroups.com
Do you mean it should work by adding only those lines in models/0_memcache.py? I've tried that, but the results are still the same :/

I was asking about the nginx/uwsgi part because I thought some configuration was required.
If no additional nginx/uwsgi configuration is needed, then I don't understand why caching isn't working yet. I'm using web2py 2.10.4 with nginx + uwsgi.

Also, there is nothing else local: it's just the welcome app with that simple test function in the default controller, plus those lines in models/0_memcache.py, all running under nginx and uwsgi.

Niphlod

May 20, 2015, 11:03:30 AM
to web...@googlegroups.com
Haaaaalt! Either you use web2py's cache (and we can help with that) OR nginx's memcached module (and you'll need to consult THEIR support for a proper config).
There is no facility whatsoever for web2py to cache the object and for nginx to fetch it from the same memcached instance.

That being said, using cache.ram in a multiprocess environment is OF COURSE a pitfall, because each process needs to be hit the first time, so what you're observing with cache.ram is entirely expected.

Once you choose to store the cached object in some external backend (disk, redis, memcached), the pitfall is closed, and you don't need any additional nginx configuration.
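[The fix Niphlod describes can be sketched in a few lines: once all workers consult the same external backend, the cached function body runs exactly once per expiry window. This is a toy model only; the `backend` dict stands in for a single redis/memcached instance, and all names are invented.]

```python
calls = []
backend = {}                  # stand-in for one shared redis/memcached instance

def compute():
    calls.append(1)           # count real computations
    return "rendered page"

def cached(key, f):
    """Every worker consults the same shared backend."""
    if key not in backend:
        backend[key] = f()
    return backend[key]

# "Two workers" now share one store: the body runs exactly once.
cached("test", compute)       # worker A: miss, computes and stores
cached("test", compute)       # worker B: hit via the shared backend
print(len(calls))             # 1
```

Contrast this with cache.ram, where each worker process would have recomputed on its own first request.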

Lisandro

May 20, 2015, 12:13:28 PM
to web...@googlegroups.com
Thanks Niphlod, and Paolo too. With your last comments I understand a little better. Sorry about the web2py / nginx confusion. I know I'm going way beyond my limits, but for now there's no budget for more :(

I understand the problem of caching in memory with multiple processes a bit better now. Niphlod, based on your last sentence, I ran a quick test (one I should have done earlier): I changed cache.ram to cache.disk, and it started working, as expected.

So, instead of memcache, I read a little about redis, and I decided to give it a try.
The installation was easy:
sudo apt-get install redis-server
sudo apt-get install python-redis

Then, somewhere in model file:
from gluon.contrib.redis_cache import RedisCache
cache.redis = RedisCache('localhost:6379',db=None, debug=True, with_lock=True)
cache.ram = cache.disk = cache.redis

This first quick test worked ok.
I used with_lock=True because of this comment in the book:

I'll try it in production and see what happens with performance and resource usage.
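[For readers wondering what with_lock=True guards against: when a popular key expires, many concurrent requests can all miss at once and all recompute the expensive value (the "dog-pile" or thundering-herd problem). A minimal threaded sketch of the locked path; all names here are invented, and real RedisCache locking happens in redis rather than in-process.]

```python
import threading
import time

store, calls = {}, []
lock = threading.Lock()

def slow_render():
    calls.append(1)           # count real computations
    time.sleep(0.02)          # pretend this is an expensive page render
    return "value"

def cached(key, f):
    """With the lock, concurrent misses serialize: only the first caller
    computes, and the rest find the value already stored."""
    with lock:
        if key not in store:
            store[key] = f()
    return store[key]

threads = [threading.Thread(target=cached, args=("k", slow_render))
           for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(calls))             # 1: the herd was stopped at the lock
```

Without the lock, several of the five threads could pass the `key not in store` check simultaneously and each run `slow_render`, which is why the book recommends with_lock under concurrent load.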
Thank you both for the help!




