Massimo
def clean(request, expiration=3600):
    import os, stat, time
    path = os.path.join(request.folder, 'sessions')
    for file in os.listdir(path):
        filename = os.path.join(path, file)
        # delete session files whose last modification is older than expiration
        if time.time() - os.stat(filename)[stat.ST_MTIME] > expiration:
            os.unlink(filename)
Massimo
I love making things maintenance-proof, but it's not too much work to set up a
script that cron calls. I didn't know about the web2py-included script
before I wrote this one (which I still use, because it's not a daemon).
I call it via Windows' built-in scheduler. Feel free to customize/use/sell.
----------------------
#!/usr/bin/env python
"""Cleans up yesterday's sessions."""
import os
import sys
from glob import glob
from datetime import datetime, time

print("Cleaning up old sessions:")
olddir = os.getcwd()
web2pydir = "c:/web2py/"
appsdir = os.path.join(web2pydir, 'applications')
doit = len(sys.argv) > 1 and sys.argv[1] == 'nukeem'

# Cut-off time is the time after which sessions will be kept;
# in this case, it is 12:00am of today.
cutofftime = datetime.combine(datetime.now(), time(0))

os.chdir(appsdir)
#applications = glob("*")
applications = ['init', 'Formstation']
for app in applications:
    sessiondir = os.path.join(appsdir, app, 'sessions')
    sessions = glob(sessiondir + "/*")
    for session in sessions:
        accessedtime = datetime.fromtimestamp(os.stat(session).st_mtime)
        if accessedtime < cutofftime:
            if doit:
                print("deleting %s from %s" % (os.path.basename(session), app))
                os.unlink(session)
            else:
                print("Would delete %s from %s" % (os.path.basename(session), app))
os.chdir(olddir)
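For non-Windows setups, the "script that cron calls" approach could look something like this crontab entry (the install path and script filename here are assumptions, not from the post):

```shell
# hypothetical crontab entry: run the session cleanup daily at 00:05
# with the 'nukeem' flag so it actually deletes instead of dry-running
5 0 * * * /usr/bin/python /opt/web2py/clean_sessions.py nukeem
```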
I would only endorse this system if it were also able to process only
a certain number of files per run, so the 19k sessions would get expired over
the course of a few cleaning runs (web page requests). Granted, the
expiration flag would be necessary to avoid accessing any expired sessions.
-timbo