Hi,
I'm trying to integrate PyMongo with a Python WSGI framework (Pylons).
Here's what I've done so far; please let me know if you can think of a
better approach. I've run this by the Pylons user group and it seems to
be OK, but I wanted to verify that sharing the connection is valid.
1. On app launch/load create a single, global pymongo connection
2. On each http request, instantiate a new pymongo database object
(actually a subclass)
3. Then use that database object in my controller actions.
Note: in step 2, I instantiate a new subclass of pymongo's Database per
request because I wanted the database object to be 'site-aware',
meaning db.events.save(doc) behaves differently depending on the
requesting site. (For example, if the site is texas.myproject.com, then
db.events.save(doc) saves the doc to the 'texas_events' collection
instead of the plain 'events' collection.)
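To make the 'site-aware' idea concrete, here's a rough sketch of what I
mean. (Sketched as a plain wrapper with a dict standing in for the real
pymongo database, just to show the collection-name prefixing; the real
class subclasses pymongo's Database.)

```python
class SiteDatabase(object):
    """Sketch of the site-aware database: attribute access is redirected
    to a collection whose name is prefixed with the current site."""

    def __init__(self, db, site):
        self._db = db      # the real thing wraps a pymongo Database
        self._site = site  # e.g. 'texas', taken from the request's subdomain

    def __getattr__(self, name):
        # db.events -> the underlying 'texas_events' collection
        return self._db['%s_%s' % (self._site, name)]
```

That way controller code can just say self.db.events.save(doc) without
caring which site it is serving.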
::Something like this is run per request::
conn = globals.mongo_conn
# get site info from wsgi environ
site = get_site_context(environ)
controller = Controller()
# pymongo db subclass, site aware
controller.db = SiteDatabase(conn, 'db_name', site)
# call the controller
controller()
# ... then, in a controller action:
self.db.events.save(request.POST)
# the line above saves the doc to the '<subdomain>_events' collection
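get_site_context isn't shown above; roughly, it just pulls the
subdomain out of the WSGI environ. Something like this (illustrative
only, and it assumes the site is the first label of site.myproject.com):

```python
def get_site_context(environ):
    """Illustrative sketch: derive the site from the request's Host
    header, e.g. 'texas.myproject.com' -> 'texas'."""
    host = environ.get('HTTP_HOST', '').split(':')[0]  # strip any port
    parts = host.split('.')
    # assume site.myproject.com, so the first label is the site;
    # a bare myproject.com has no site
    return parts[0] if len(parts) > 2 else None
```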
Is it OK to share the global connection object among multiple threads
(requests)? It seems to be working.
Other approaches?