I guess your code is something like:

    def clean_up(self, *a, **k):
        stuff_in_base = DBSession.query(Stuff).all()
        for s in stuff_in_base:
            if match_condition_for_deletion(s):
                DBSession.delete(s)
If your connections are not being recycled, you could be hitting one or
more of the following hypotheses:
* your scheduler should queue your tasks so that parallel executions
never exceed max_connections (see the throttling sketch after this list) ;
* you live in the 1990's and think DB connections are costly: read the
SQLAlchemy docs and push your QueuePool up to 100 max connections if you
need to perform more than 10 DB operations simultaneously, and set up
your DB accordingly (see the engine sketch below) ;
* you forgot to commit your modifications, thus leaving the session
open (import transaction / transaction.commit()) ;
* your code could be lacking a «with» statement,
http://www.python.org/dev/peps/pep-0343/ (see the transaction sketch
after this list) ;
* you have a deadlock that occasionally stalls your connections
(beware of cyclic references in your model) ;
* your DB is very slow and/or poorly configured (like MySQL when a
MyISAM table reaches 1 million entries and you try to delete from it).
If you are in this configuration, don't look further: MySQL is not a DB,
it's a toy. (That is an obvious troll, don't fall for it.)
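
For the first point, a minimal throttling sketch, assuming your cleanups
run in threads (the limit of 10 is made up, match it to your pool):

    import threading

    db_slots = threading.BoundedSemaphore(10)   # at most 10 cleanups touch the DB at once

    def throttled_clean_up(obj, *a, **k):
        # hypothetical wrapper around your existing clean_up method
        with db_slots:
            obj.clean_up(*a, **k)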
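
For the pool sizing point, a sketch of the engine setup (the DSN and
numbers are only examples, tune them to your workload):

    from sqlalchemy import create_engine

    engine = create_engine(
        "postgresql://user:pw@localhost/mydb",   # example DSN
        pool_size=20,        # connections the QueuePool keeps open
        max_overflow=80,     # extra connections allowed under load
        pool_recycle=3600,   # recycle connections older than one hour
    )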
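
And for the commit / «with» points, a sketch assuming the zope
"transaction" package with a DBSession joined to it, as in a stock
Pyramid/SQLAlchemy setup:

    import transaction

    def clean_up(self, *a, **k):
        with transaction.manager:          # commits on success, aborts on error
            for s in DBSession.query(Stuff).all():
                if match_condition_for_deletion(s):
                    DBSession.delete(s)
        # the session's connection goes back to the pool here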
Whatever your problem is, with so little information I doubt anybody
will be able to give you a more accurate answer. You should add some
logging to your code (cleanup start/stop) and you should profile your DB
for slow queries & locks.
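
For instance, a minimal logging sketch (the logger names are just
illustrative):

    import logging, time

    logging.basicConfig(level=logging.INFO)
    logging.getLogger("sqlalchemy.engine").setLevel(logging.INFO)  # logs every SQL statement
    log = logging.getLogger("cleanup")

    def clean_up(self, *a, **k):
        start = time.time()
        log.info("cleanup start")
        for s in DBSession.query(Stuff).all():
            if match_condition_for_deletion(s):
                DBSession.delete(s)
        log.info("cleanup done in %.1fs", time.time() - start)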
I guess you have a mix of all that.
--
cheers
Hf, gl
Julien