Usage of functools.lru_cache on coroutine-decorated functions.


Shane Spencer

Apr 29, 2013, 1:26:56 PM
to python-...@googlegroups.com
Has anybody found a snug way of incorporating the two?  Whenever I use tornado.gen.coroutine as a decorator I'm either yielding or raising, of course, but I would like to decorate the result of the function with functools.lru_cache.  The only success I have had with lru_cache has been with synchronous calls.

I'm imagining a wrapper for lru_cache that also raises in kind or yields a result list as well.  This seems incredibly naughty, however.

I typically use Redis to help with these situations.  It is simply a personal goal to see if I can use an in-process LRU with Python-ready objects rather than doing another network call and deserializing the result.  The goal here is to keep a lot of lookups that need to be blindingly fast in process memory while allowing them to expire.

Currently I am using a dictionary based cache to deal with this and it is not ideal.

I have also had luck with slaving to a Redis host over a Tornado socket, placing those keys into memory, using an ioloop timeout to expire them, and then pushing new information to Redis (which in turn comes right back to me).

All in all I'd love to just use functools.lru_cache.

- Shane

Ben Darnell

Apr 29, 2013, 10:41:56 PM
to Tornado Mailing List
This seems to work fine (on Python 3.3) -- just have lru_cache store the Futures instead of the internal results:

from tornado import gen
from tornado.ioloop import IOLoop

import functools
import time

@functools.lru_cache()
@gen.coroutine
def do_something_slow(arg1, arg2):
    # lru_cache wraps the coroutine, so it caches the Future the
    # coroutine returns; a repeated call yields the cached Future
    # and skips the body entirely.
    print("doing something slow with %s and %s" % (arg1, arg2))
    yield gen.Task(IOLoop.current().add_timeout, time.time() + 1)
    print("finished")
    return arg1 + arg2

@gen.coroutine
def main():
    print("starting main")
    result = yield do_something_slow(2, 2)
    print("got result %s" % result)
    result = yield do_something_slow(3, 4)
    print("got result %s" % result)
    result = yield do_something_slow(2, 2)
    print("got result %s" % result)

IOLoop.current().run_sync(main)

starting main
doing something slow with 2 and 2
finished
got result 4
doing something slow with 3 and 4
finished
got result 7
got result 4
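The same Future-caching trick carries over to plain asyncio on modern Python. A minimal sketch, assuming Python 3.7+; `slow_double`, `fetch`, and the delay are illustrative names, not anything from Tornado or Motor:

```python
import asyncio
import functools

call_count = 0  # counts how often the slow body actually runs

async def slow_double(key):
    # Stand-in for a slow lookup (network call, DB query, etc.).
    global call_count
    call_count += 1
    await asyncio.sleep(0.01)
    return key * 2

@functools.lru_cache()
def fetch(key):
    # Cache the Task (a Future), not the unwrapped result: repeated
    # callers await the same in-flight or completed computation.
    return asyncio.ensure_future(slow_double(key))

async def main():
    a = await fetch(21)
    b = await fetch(21)  # cache hit: slow_double runs only once
    return a, b

result = asyncio.run(main())
print(result, call_count)  # prints (42, 42) 1
```

Note that `fetch` must be called while the event loop is running, since `ensure_future` needs one; and a failed coroutine leaves a failed Future in the cache, which may or may not be what you want.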



--
You received this message because you are subscribed to the Google Groups "Tornado Web Server" group.
To unsubscribe from this group and stop receiving emails from it, send an email to python-tornad...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
Shane Spencer

Apr 29, 2013, 10:50:29 PM
to python-...@googlegroups.com
*blink* *blink*

I will.. WILL.. get this.

Shane Spencer

Apr 30, 2013, 2:44:24 AM
to python-...@googlegroups.com
Thank you Ben, you're a champ!  MotorDB is now being cached a bit in several situations.  Unfortunately I have to bring my Motor connection instance in as a global :|.  I'll get around this by replacing the make_keys function used by lru_cache to omit certain keyword arguments and positional-argument indexes.

!! Super fast !!

- Shane

A. Jesse Jiryu Davis

May 1, 2013, 10:27:35 AM
to python-...@googlegroups.com, sh...@bogomip.com
Huh, can you show some of your code so I understand what you're saying about making the connection instance global? I haven't used lru_cache myself but I wonder if there's a way to make your solution more elegant.

Shane Spencer

May 1, 2013, 1:33:48 PM
to A. Jesse Jiryu Davis, python-...@googlegroups.com
Short and sweet

@functools.lru_cache(...)
def something(self, somethingsomething):
  return something

'self' is distinct per instance and becomes part of the lru_cache key, when all I'm concerned with caching is somethingsomething.  I'll need to override the make_keys function in order to exclude it from key hashing.
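One way to keep `self` out of the key without touching lru_cache internals is a small memoizing decorator that keys only on the call arguments. A sketch, assuming the result genuinely doesn't depend on the instance; `cache_ignoring_self` and `Lookup` are illustrative names, and the plain dict here has no LRU eviction, so it's a simplification of lru_cache:

```python
import functools

def cache_ignoring_self(method):
    """Memoize a method keyed only on its positional arguments,
    ignoring self.  The dict is shared by every instance."""
    cache = {}

    @functools.wraps(method)
    def wrapper(self, *args):
        if args not in cache:
            cache[args] = method(self, *args)
        return cache[args]
    return wrapper

class Lookup:
    calls = 0  # counts how often the body actually runs

    @cache_ignoring_self
    def fetch(self, key):
        Lookup.calls += 1
        return key.upper()  # stand-in for a slow call

a, b = Lookup(), Lookup()
print(a.fetch("x"), b.fetch("x"), Lookup.calls)  # prints X X 1
```

The second call hits the cache even though it comes from a different instance, which is exactly the behavior you lose when `self` participates in the key.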

I've been looking at motor_blog's cache.py quite a bit and it's very similar to how I deal with a Redis-based solution with eventual invalidation.

- Shane
