Build classes/packages dynamically

Paulo Pinto

Dec 15, 2003, 10:37:11 AM
Hi,


I have a package that generates classes from a
set of XML files using exec.

So far the classes appear in the global namespace.

Is there any way to also create packages dynamically
and add the classes to those packages?

Thanks in advance,
Paulo Pinto
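
A minimal sketch of the kind of exec-based generator described above; the XML handling is omitted and the class name Example is made up, but it shows how exec drops the generated class straight into whatever namespace it is given, here the interpreter's globals:

>>> source = "class Example(object):\n\tpass\n"  # would really be built from an XML file
>>> exec source in globals()
>>> Example
<class '__main__.Example'>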

Peter Otten

Dec 15, 2003, 11:00:02 AM
Paulo Pinto wrote:

>>> import types
>>> mymodule = types.ModuleType("mymodule")
>>> exec "def demo():\n\tprint 'hello from', __name__\n" in
mymodule.__dict__
>>> mymodule.demo()
hello from mymodule
>>>

Seems to work. I haven't used it myself, though.

Peter

Michele Simionato

Dec 15, 2003, 1:25:22 PM
Paulo Pinto <paulo...@cern.ch> wrote in message news:<brkkf7$jqg$1...@sunnews.cern.ch>...

By packages I think you mean modules. Here is a solution in Python 2.3:

>>> from types import ModuleType
>>> mymodule=ModuleType("mymodule")
>>> print mymodule
<module 'mymodule' (built-in)>
>>> class C(object): pass
...
>>> mymodule.C=C

In older Python versions, look for the module "new".
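
For older Pythons the equivalent with the "new" module looks roughly like this (a sketch only, reusing the class C from above):

>>> import new
>>> mymodule = new.module("mymodule")
>>> print mymodule
<module 'mymodule' (built-in)>
>>> mymodule.C = C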

Paulo Pinto

Dec 16, 2003, 6:22:10 AM
Thanks, that is what I was looking for.
However, I still have a problem.

I want to make the module available to the
caller as if he did an import.

For example, if I make the following call

some_module.generate_module('dummy')

Where some_module is the module that generates
modules dynamically, and dummy is the name of the
new module.

I would like to be able to do

dummy.something()

after that call.

I've discovered that if I do something like this

globals()['dummy'] = module_instance_returned_by_new.module()


It works, but it must be done at the same level where I want to
call dummy.something(), not from inside some_module, because
if I do it inside the module, globals() will be referring to the
module's globals and not to the parent scope.

Basically I would like to import the generated module into the
module that is invoking generate_module(), like an uplevel in
Tcl.

Is this possible?

Cheers,
Paulo Pinto
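
One CPython-specific way to get that uplevel-like effect is to reach into the caller's frame from inside generate_module(). This is only a sketch (the file name and the something() body are made up), and sys._getframe() is an implementation detail of CPython rather than guaranteed language behaviour:

<generate_uplevel.py>
import sys
from types import ModuleType

def generate_module(name):
    module = ModuleType(name)
    exec "def something():\n\tprint 'hello from', __name__\n" in module.__dict__
    # bind the new module directly in the caller's global namespace,
    # so that dummy.something() works right after the call
    sys._getframe(1).f_globals[name] = module
    return module
</generate_uplevel.py>

>>> import generate_uplevel
>>> generate_uplevel.generate_module("dummy")
<module 'dummy' (built-in)>
>>> dummy.something()
hello from dummy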

Peter Otten

Dec 16, 2003, 8:01:43 AM
Paulo Pinto wrote:

> Thanks, that is what I was looking for.
> However, I still have a problem.
>
> I want to make the module available to the
> caller as if he did an import.
>
> For example, if I make the following call
>
> some_module.generate_module('dummy')
>
> Where some_module is the module that generates
> modules dynamically, and dummy is the name of the
> new module.
>
> I would like to be able to do
>
> dummy.something()
>
> after that call.
>
> I've discovered that if I do something like this
>
> globals()['dummy'] = module_instance_returned_by_new.module()
>
>
> It works, but it must be done at the same level where I want to
> call dummy.something(), not from inside some_module, because
> if I do it inside the module, globals() will be referring to the
> module's globals and not to the parent scope.
>
> Basically I would like to import the generated module into the
> module that is invoking generate_module(), like an uplevel in
> Tcl.
>
> Is this possible?

Don't know, but rebinding in the calling scope from inside a function call
looks like fighting the language to me. Maybe redefining __import__() would
be better:

<myimport.py>
import __builtin__
import types, sys

originalimport = __builtin__.__import__

def myimport(name, *args):
    print "importing", name
    try:
        return originalimport(name, *args)
    except ImportError:
        print "generating", name
        module = types.ModuleType(name)
        exec "def demo(*args):\n\tprint 'demo%r' % (args,)\n" in module.__dict__
        sys.modules[name] = module  # put it into the cache
        return module

__builtin__.__import__ = myimport
</myimport.py>

I simply generate any module that cannot successfully be imported, but you
could change this to meet your needs. Now a usage example:

<usemyimport.py>
import myimport # put it first, because it messes with the built-ins

print "first import"
import os
import generated
print
print "second import"
import os, generated

generated.demo("hi", "there")
</usemyimport.py>

However, this is just a smoother variant of the

module = generate("modulename")

pattern.

Peter
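
If the hook is only needed while the generated modules are being set up, it can be undone again via the saved reference; a short sketch reusing the names from myimport.py above:

import __builtin__
import myimport

# put the normal import machinery back
__builtin__.__import__ = myimport.originalimport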

vincent wehren

Dec 16, 2003, 1:29:01 PM

"Paulo Pinto" <paulo...@cern.ch> schrieb im Newsbeitrag
news:brmpt2$7nn$1...@sunnews.cern.ch...

| Thanks, that is what I was looking for.
| However, I still have a problem.
|
| I want to make the module available to the
| caller as if he did an import.
|
| For example, if I make the following call
|
| some_module.generate_module('dummy')
|
| Where some_module is the module that generates
| modules dynamically, and dummy is the name of the
| new module.
|
| I would like to be able to do
|
| dummy.something()
|
| after that call.
|
| I've discovered that if I do something like this
|
| globals()['dummy'] = module_instance_returned_by_new.module()
|
|
| It works, but it must be done at the same level where I want to
| call dummy.something(), not from inside some_module, because
| if I do it inside the module, globals() will be referring to the
| module's globals and not to the parent scope.
|
| Basically I would like to import the generated module into the
| module that is invoking generate_module(), like an uplevel in
| Tcl.
|
| Is this possible?

Sure

Put this in a file called "dynmods.py" or something:

from types import ModuleType

def generate_module(dynmod):
    exec '%(dynmod)s = ModuleType("%(dynmod)s")' % vars() in globals()
    dynmod = globals()[dynmod]
    exec "def something():\n\tprint 'hello from', __name__\n" in dynmod.__dict__
    return dynmod

Fire up a shell and:

>>> import dynmods
>>> dummy = dynmods.generate_module("dummy")
>>> print dummy
<module 'dummy' (built-in)>
>>> dummy.something()
hello from dummy
>>>

Is this what you were looking for?

Regards

Vincent Wehren

Naerbnic

Dec 16, 2003, 5:07:37 PM
> I want to make the module available to the
> caller as if he did an import.
>
> For example, if I make the following call
>
> some_module.generate_module('dummy')
>
> Where some_module is the module that generates
> modules dynamically, and dummy is the name of the
> new module.
>
> I would like to be able to do
>
> dummy.something()
>
> after that call.

Well, this isn't the perfect solution, but you can create modules in
the system at runtime like so:

In your code:
>>> mymodule = generate_module() #As above
>>> import sys
>>> sys.modules['dummy'] = mymodule

Now, the user can just do the following:
>>> import dummy
>>> dummy.something()

It's not _quite_ what you want, since it doesn't automatically include
it, but it's pretty easy to do. Furthermore, if you standardize the
name of the module, you can just have the user import that by default,
and then use whatever dynamic content you've put inside.
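
Following that suggestion, the registration can live inside the generating package itself, so user code only ever writes a plain import of one standardized name (a sketch; generate_module() is the one from the snippet above, and the name "dynamic_api" is made up):

# inside the generating package, after the dynamic content has been built:
import sys
sys.modules["dynamic_api"] = generate_module()  # as above

# user code then only needs:
import dynamic_api
dynamic_api.something()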

Also, note that mymodule can actually be anything, although you should
still use a value that responds to __getattr__ (anything else would be
really confusing, and probably a Bad Thing).

I hope this helps.

- Brian Chin

Paulo Pinto

Dec 17, 2003, 5:10:09 AM
Thanks for the replies.
I now have a working solution.
