Memory leak using Redis for sessions


Lisandro

Jul 25, 2022, 5:23:37 PM
to web2py-users
I'm running web2py in production and I use a Redis server to store a couple of million sessions, but I'm facing a memory problem that I haven't been able to fix.

I use session_expiry=172800 (two days). The application load is very stable (it handles about 60 requests per second), so I thought that after a few weeks of running I would know how much memory Redis needs. However, the memory used by Redis keeps increasing indefinitely.

I don't use sessions for anything other than a very small percentage of users who can log in and do some administrative tasks. The vast majority of users can't log in.

Just in case, I checked that every key in Redis has an expiration time:
$ redis-cli info keyspace
# Keyspace
db0:keys=1549547,expires=1548249,avg_ttl=89380135


I've also checked a few session keys and saw that a session occupies about 250 bytes. However, the memory used by Redis grows slowly and constantly: in a whole year it reached the 24 GB of RAM that the server has, which is insane, right?
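To put numbers on that, here is a back-of-the-envelope estimate (the ~100-byte per-key overhead is my own rough assumption for key names, expiry metadata and allocator bookkeeping, not a measured Redis figure):

```python
# Rough sizing estimate for the keyspace shown above.
keys = 1_549_547          # from `redis-cli info keyspace`
payload = 250             # observed bytes per serialized session
overhead = 100            # ASSUMED per-key bookkeeping overhead

total_bytes = keys * (payload + overhead)
total_mib = total_bytes / (1024 * 1024)
print(round(total_mib))   # ~517 MiB, nowhere near 24 GB
```

Even with a generous overhead estimate, the whole keyspace should fit in well under 1 GB, which is why the growth to 24 GB looks so wrong.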

I had Redis configured to limit the amount of RAM it can use:
maxmemory 20gb
maxmemory-policy volatile-lru

However, as I mentioned before, after a whole year it reached that limit, and my app started throwing this error:

Traceback (most recent call last):
  File "/var/www/medios/gluon/main.py", line 462, in wsgibase
    session._try_store_in_db(request, response)
  File "/var/www/medios/gluon/globals.py", line 1226, in _try_store_in_db
    record_id = table.insert(**dd)
  File "/var/www/medios/gluon/contrib/redis_session.py", line 138, in insert
    newid = str(self.db.r_server.incr(self.serial))
  File "/var/www/medios/venv_medios/lib/python2.7/site-packages/redis/client.py", line 651, in incr
    return self.execute_command('INCRBY', name, amount)
  File "/var/www/medios/venv_medios/lib/python2.7/site-packages/redis/client.py", line 394, in execute_command
    return self.parse_response(connection, command_name, **options)
  File "/var/www/medios/venv_medios/lib/python2.7/site-packages/redis/client.py", line 404, in parse_response
    response = connection.read_response()
  File "/var/www/medios/venv_medios/lib/python2.7/site-packages/redis/connection.py", line 316, in read_response
    raise response
ResponseError: OOM command not allowed when used memory > 'maxmemory'.


I thought that error was impossible given that Redis has a maxmemory limit and is instructed to evict keys when the limit is reached. However, I realised that this type of scenario (lots of keys being written and lots of keys being deleted) can lead to memory fragmentation. Redis has a defragmentation option, so I made some changes.

I reduced the maxmemory Redis limit and activated auto defragmentation:
maxmemory 1gb
maxmemory-policy volatile-lru
activedefrag yes

Redis auto defragmentation works like a charm: when it wasn't active, I could see the mem_fragmentation_ratio slowly increasing. After activating it, it stayed at a stable, healthy value of 1.05.
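The value I was watching comes from `redis-cli info memory`. A tiny helper like this extracts it (the sample output below is illustrative, with made-up numbers, not from my server):

```python
def fragmentation_ratio(info_text):
    """Parse mem_fragmentation_ratio out of `INFO memory` output."""
    for line in info_text.splitlines():
        if line.startswith("mem_fragmentation_ratio:"):
            return float(line.split(":", 1)[1])
    return None  # field not present in this INFO section

# Illustrative sample of `redis-cli info memory` output (values made up).
sample = """# Memory
used_memory:629145600
used_memory_rss:660602880
mem_fragmentation_ratio:1.05"""

print(fragmentation_ratio(sample))  # 1.05
```

(mem_fragmentation_ratio is used_memory_rss divided by used_memory, so values well above 1.0 mean the allocator holds more OS memory than Redis is actually using.)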

After a week running (remember, all the sessions expire in two days) Redis was using about 600 MB of RAM. But the usage kept growing and reached the maxmemory limit a few days later.
At that point, I could verify that Redis started evicting keys to make space (which was expected, according to the configuration).
However, a few days later my app again started to throw the error with the exact same traceback I posted before :/

What could be happening? I'm pretty sure that I don't need more than 1 or 2 GB of RAM to handle the sessions with Redis. So why does it crash? Could it be a memory leak in the gluon/contrib/redis_session.py adapter?

One thing: I've never run sessions2trash.py.
But if I understand the documentation correctly, I don't need to run it, since I set an expiration time for every session.
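My working theory at this point was that some bookkeeping structure grows even though the session keys themselves expire. A toy sketch of that failure mode (a plain dict and set standing in for Redis; the idea of an auxiliary index without a TTL is my assumption, not something I verified against redis_session.py):

```python
# Toy model: session payload keys expire, but an index that tracks
# session ids has no TTL, so it grows without bound.
sessions = {}          # stands in for Redis keys that carry a TTL
session_index = set()  # stands in for an ASSUMED auxiliary index, no TTL

def new_session(sid):
    sessions[sid] = b"x" * 250  # ~250-byte payload, as measured above
    session_index.add(sid)      # index entry is never expired

def expire_all_sessions():
    sessions.clear()            # the TTL fires: payloads are reclaimed

for day in range(3):
    for i in range(1000):
        new_session("sess:%d:%d" % (day, i))
    expire_all_sessions()

print(len(sessions))       # 0    -> payload memory is reclaimed
print(len(session_index))  # 3000 -> the index keeps growing
```

If something like this happens inside the adapter, memory would creep up exactly as I'm seeing, regardless of per-session TTLs.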

Let me know what you think, any help will be much appreciated.
Thanks!
Warm regards,
Lisandro

Lisandro

Aug 6, 2022, 6:58:50 AM
to web2py-users
It's not a memory leak. You have to run sessions2trash.py periodically, even if you're using Redis and setting an expiration time for every session.
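In case it helps someone else, a crontab entry along these lines schedules the cleaner hourly (the path matches my install; the -o and -x flags are from my reading of the web2py book, so double-check them against your web2py version):

```shell
# Run sessions2trash.py every hour for the "medios" app.
# -o = run once and exit; -x 172800 = treat sessions older than
# two days as expired (flags per the web2py book; verify locally).
0 * * * * cd /var/www/medios && python web2py.py -S medios -M \
    -R scripts/sessions2trash.py -A -o -x 172800
```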