I have a package that generates classes from a
set of XML files using exec.
So far the classes appear in the global namespace.
Is there any way to also create packages dynamically
and add the classes to those packages?
Thanks in advance,
Paulo Pinto
>>> import types
>>> mymodule = types.ModuleType("mymodule")
>>> exec "def demo():\n\tprint 'hello from', __name__\n" in mymodule.__dict__
>>> mymodule.demo()
hello from mymodule
>>>
Seems to work. I haven't used it myself, though.
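For readers on Python 3, where exec is a function and print needs parentheses, the same sketch might look like this:

```python
import types

# Create an empty module object and exec code into its namespace.
mymodule = types.ModuleType("mymodule")
exec("def demo():\n    return 'hello from ' + __name__\n", mymodule.__dict__)

print(mymodule.demo())  # hello from mymodule
```

Because the code is exec'd with the module's __dict__ as its globals, the function sees the module's own __name__.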
Peter
By packages I think you mean modules. Here is a solution in Python 2.3:
>>> from types import ModuleType
>>> mymodule=ModuleType("mymodule")
>>> print mymodule
<module 'mymodule' (built-in)>
>>> class C(object): pass
...
>>> mymodule.C=C
In older Python versions, look for the module "new".
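(The "new" module is gone in Python 3, but types.ModuleType remains, and attaching a class works the same way; a sketch:)

```python
import types

mymodule = types.ModuleType("mymodule")

class C(object):
    pass

mymodule.C = C
C.__module__ = "mymodule"  # optional: make the class report its new home

obj = mymodule.C()
```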
I want to make the module available to the
caller as if he did an import.
For example, if I make the following call
some_module.generate_module('dummy')
Where some_module is the module that generates
modules dynamically, and dummy is the name of the
new module.
I would like to be able to do
dummy.something()
after that call.
I've discovered that if I do something like this
globals()['dummy'] = module_instance_returned_by_new.module()
It works, but it must be done at the same level where I want to
call dummy.something() and not from inside some_module, because
if I do it inside the module, globals() will be referring to the
module's own globals and not to the parent scope.
Basically I would like to import the generated module to the
module that is invoking generate_module() like an uplevel in
Tcl.
Is this possible?
Cheers,
Paulo Pinto
> Thanks, it is what I was looking for.
> However I still have a problem.
>
> I want to make the module available to the
> caller as if he did an import.
> [...]
> Basically I would like to import the generated module to the
> module that is invoking generate_module() like an uplevel in
> Tcl.
>
> Is this possible?
Don't know, but rebinding in the calling scope from inside a function call
looks like fighting the language to me. Maybe redefining __import__() would
be better:
<myimport.py>
import __builtin__
import types, sys
originalimport = __builtin__.__import__

def myimport(name, *args):
    print "importing", name
    try:
        return originalimport(name, *args)
    except ImportError:
        print "generating", name
        module = types.ModuleType(name)
        exec "def demo(*args):\n\tprint 'demo%r' % (args,)\n" in module.__dict__
        sys.modules[name] = module  # put it into the cache
        return module

__builtin__.__import__ = myimport
</myimport.py>
I simply generate any module that cannot successfully be imported, but you
could change this to meet your needs. Now a usage example:
<usemyimport.py>
import myimport # put it first, because it messes with the built-ins
print "first import"
import os
import generated
print
print "second import"
import os, generated
generated.demo("hi", "there")
</usemyimport.py>
However, this is just a smoother variant of the
module = generate("modulename")
pattern.
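(The same hook still works in Python 3 via the builtins module; sys.meta_path finders are the officially supported mechanism for custom imports nowadays, but overriding __import__ remains possible. A sketch:)

```python
import builtins
import sys
import types

_original_import = builtins.__import__

def myimport(name, *args, **kwargs):
    try:
        return _original_import(name, *args, **kwargs)
    except ImportError:
        # Generate any module that cannot be imported normally.
        module = types.ModuleType(name)
        exec("def demo(*args):\n    return 'demo%r' % (args,)\n",
             module.__dict__)
        sys.modules[name] = module  # cache so later imports reuse it
        return module

builtins.__import__ = myimport

import generated  # not on disk anywhere; gets generated on the fly
print(generated.demo("hi", "there"))  # demo('hi', 'there')
```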
Peter
Sure
Put this in a file called "dynmods.py" or something:
from types import ModuleType

def generate_module(dynmod):
    exec '%(dynmod)s = ModuleType("%(dynmod)s")' % vars() in globals()
    dynmod = globals()[dynmod]
    exec "def something():\n\tprint 'hello from', __name__\n" in dynmod.__dict__
    return dynmod
fire up a shell and:
>>> import dynmods
>>> dummy = dynmods.generate_module("dummy")
>>> print dummy
<module 'dummy' (built-in)>
>>> dummy.something()
hello from dummy
>>>
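(A Python 3 rendering of the same idea, as a sketch; note there is no need to go through the generating module's own globals() at all, since the module object can be built locally and returned directly:)

```python
import types

def generate_module(name):
    mod = types.ModuleType(name)
    exec("def something():\n    return 'hello from ' + __name__\n",
         mod.__dict__)
    return mod

dummy = generate_module("dummy")
print(dummy.something())  # hello from dummy
```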
Is this what you were looking for?
Regards
Vincent Wehren
Well, this isn't the perfect solution, but you can create modules in
the system at runtime like so:
In your code:
>>> mymodule = generate_module() #As above
>>> import sys
>>> sys.modules['dummy'] = mymodule
Now, the user can just do the following:
>>> import dummy
>>> dummy.something()
It's not _quite_ what you want, since it doesn't automatically include
it, but it's pretty easy to do. Furthermore, if you standardize the
name of the module, you can just have the user import that by default,
and then use whatever dynamic content you've put inside.
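(In Python 3 the same registration trick looks like this, using a hypothetical module name "dummy":)

```python
import sys
import types

mymodule = types.ModuleType("dummy")
mymodule.something = lambda: "hello from dummy"

# Register under the import name; the import machinery checks
# sys.modules first, so 'import dummy' now finds it without any file.
sys.modules["dummy"] = mymodule

import dummy
print(dummy.something())  # hello from dummy
```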
Also, note that mymodule can actually be any object, although you should
still use one that supports attribute access via __getattr__ (anything
else would be really confusing, and probably a Bad Thing).
I hope this helps.
- Brian Chin