On Tue, Jan 15, 2013 at 8:48 PM, Hetii <ghe...@gmail.com> wrote:
> Even when I dump all of them into the declarative base model, it's still a
> huge amount of data that needs to be parsed and loaded.
>
> I want to ask whether it's possible to share table/column definitions across
> different database models, to reduce the amount of resources used?
Even if you can't share the objects themselves (not sure you can, you
probably can't), you can share the code that generates them.
Remember that Python is dynamic, and a class statement is code that
actually creates a class object:
def init():
    # All the declarative class definitions go in here.
    class Blah(Base):
        __tablename__ = 'blah'
        id = Column(Integer, primary_key=True)
    return locals()

globals().update(init())
You can call that init function as many times as you want. I use
something like that to map two identical databases (master and
replica) into two namespaces I can pick depending on the task.
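For what it's worth, here is a minimal sketch of that two-database setup,
assuming SQLAlchemy's declarative extension; the class/table names and the
one-Base-per-database arrangement are just placeholders I made up for
illustration:

from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

def init(Base):
    # The class statements run on every call, so each call yields a
    # fresh set of mapped classes tied to whatever Base was passed in.
    class User(Base):
        __tablename__ = 'users'
        id = Column(Integer, primary_key=True)
        name = Column(String(50))

    class Order(Base):
        __tablename__ = 'orders'
        id = Column(Integer, primary_key=True)
        user_id = Column(Integer)

    return locals()

# One declarative base (and engine) per database, populated by the same code.
MasterBase = declarative_base()
ReplicaBase = declarative_base()

master = init(MasterBase)    # master['User'], master['Order'], ...
replica = init(ReplicaBase)  # independent classes, identical definitions

master_engine = create_engine('sqlite:///master.db')
replica_engine = create_engine('sqlite:///replica.db')
MasterBase.metadata.create_all(master_engine)
ReplicaBase.metadata.create_all(replica_engine)

# Pick whichever namespace matches the task at hand.
replica_session = sessionmaker(bind=replica_engine)()
print(replica_session.query(replica['User']).count())

Each call to init() runs the class statements again, so master and replica
end up with completely independent mapped classes built from the same
definition code.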