Here are some problems and solutions for running web2py on appengine.
I am using these techniques at
http://ru.ly in production on
appengine.
1) cPickle does not correctly alias to pickle
cPickle works in the SDK, but not on Google's production servers:
http://code.google.com/p/googleappengine/issues/detail?id=284&q=pickle&colspec=ID%20Type%20Status%20Priority%20Stars%20Owner%20Summary
Solution: wherever cPickle is imported, alias pickle instead:
import pickle as cPickle
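If you want the same module to keep using the real cPickle when it is available (for example when running web2py outside appengine), a small fallback works; this is just a sketch of the idea, not something web2py requires:

try:
    import cPickle
except ImportError:
    # appengine production has no working cPickle, so fall back to pure-python pickle
    import pickle as cPickle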
2) the wsgihandler is reloaded on *every* request
Solution, let Google cache main() in wsgihandler.py:
def main():
    import sys, os
    path = os.path.dirname(os.path.abspath(__file__))
    if path not in sys.path:
        sys.path.append(path)
    import gluon.main
    application = gluon.main.wsgibase

if __name__ == "__main__":
    main()
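With app caching, main() is what gets called on each request, so it also has to write out the response. A minimal sketch of the full handler, assuming the CGI-style runtime and using the standard library's wsgiref to serve gluon.main.wsgibase:

def main():
    import sys, os, wsgiref.handlers
    path = os.path.dirname(os.path.abspath(__file__))
    if path not in sys.path:
        sys.path.append(path)
    import gluon.main
    # serve this request through web2py's WSGI callable
    wsgiref.handlers.CGIHandler().run(gluon.main.wsgibase)

if __name__ == "__main__":
    main()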
3) @cache caches POST requests by default
Since web2py forms are usually self-submitting, you have to be careful not to
cache form submissions. Instead of checking for 'POST' in every
function, I made a ConditionalCache:
cache_enabled = True

def default_cache_conditions():
    # only cache when caching is globally enabled and the request is a GET
    return cache_enabled and request.env.request_method == 'GET'

from gluon.cache import Cache

class ConditionalCache(Cache):
    def __call__(self, key=None, time_expire=300, cache_model=None,
                 test=default_cache_conditions):
        if not cache_model:
            cache_model = self.ram
        if test:
            def tmp(func):
                if test():
                    return lambda: cache_model(key, func, time_expire)
                else:
                    # condition not met (e.g. a POST): leave the function uncached
                    return func
        else:
            def tmp(func):
                return lambda: cache_model(key, func, time_expire)
        return tmp

cache = ConditionalCache(request)
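For example, a self-submitting action can be decorated as usual and the decorator simply steps aside on POST (the action name and form below are just illustrative):

@cache(request.env.path_info, time_expire=300)
def contact():
    # cached for GET requests, recomputed (and never cached) when the form posts back
    form = FORM(INPUT(_name='email'), INPUT(_type='submit'))
    if form.accepts(request.vars, session):
        response.flash = 'thank you'
    return dict(form=form)

Because web2py re-executes the controller file on every request, the test is re-evaluated at decoration time, so POST submissions always run the real function.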
4) no cron tasks for stale session cleanup
Solution, store sessions in memcache:
from gluon.contrib.memdb import *
from google.appengine.api.memcache import Client
session.connect(request,response,db=MEMDB(Client()))
This trick works with both Google's memcache and regular memcache, is
faster than hitting the datastore, and means you never have to explicitly
clean up sessions.
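If you want the same model file to keep working outside appengine, you can guard the memcache sessions with a runtime check; the SERVER_SOFTWARE test below is just one way to detect appengine (production or the SDK), not something web2py provides:

import os
server_software = os.environ.get('SERVER_SOFTWARE', '')
if server_software.startswith('Google App Engine') or server_software.startswith('Development'):
    # on appengine (or dev_appserver): keep sessions in memcache
    from gluon.contrib.memdb import MEMDB
    from google.appengine.api.memcache import Client
    session.connect(request, response, db=MEMDB(Client()))
# otherwise web2py's default file-based sessions are used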
If there is interest, I will submit patches once these features are
stable.
Robin