
Creating a "package" using C extensions?


Courageous

Dec 8, 2001, 1:59:15 PM

So. I've written Python C extensions before. And I've
written Python packages before, with subdirectories and
__init__.py files and so forth before. What I'm unclear
on is the proper way to combine these.

I can see that if I just used a normal __init__.py and
then pointed it to C extension dlls, this would probably
work.

Is this the approved way of doing this?

Is there a way to govern the entire thing entirely from
C? IOW, is there a __init__.py equivalent for C modules?

C//

Gordon McMillan

Dec 9, 2001, 11:32:14 AM
Courageous wrote:

>
> So. I've written Python C extensions before. And I've
> written Python packages before, with subdirectories and
> __init__.py files and so forth before. What I'm unclear
> on is the proper way to combine these.
>
> I can see that if I just used a normal __init__.py and
> then pointed it to C extension dlls, this would probably
> work.
>
> Is this the approved way of doing this?

Yes.



> Is there a way to govern the entire thing entirely from
> C? IOW, is there a __init__.py equivalent for C modules?

You could probably hack something up by butchering the
module object in some way, but the import code assumes that
extension modules are just plain modules, not packages.

-- Gordon
http://www.mcmillan-inc.com/
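Gordon's "Yes" can be made concrete with a pure-Python sketch. Here a plain .py file stands in for the compiled extension (in real life it would be a .pyd/.so built from C); all file and function names below are invented for illustration:

```python
import os
import sys
import tempfile

# Build a throwaway package on disk. "fast.py" stands in for a compiled
# C extension (fast.pyd / fast.so); the __init__.py simply points at it,
# which is the approach confirmed above. All names here are invented.
root = tempfile.mkdtemp()
pkgdir = os.path.join(root, "mypkg")
os.makedirs(pkgdir)

with open(os.path.join(pkgdir, "__init__.py"), "w") as f:
    f.write("from mypkg import fast\n")   # __init__.py "points to" the extension

with open(os.path.join(pkgdir, "fast.py"), "w") as f:
    f.write("def square(x):\n    return x * x\n")

sys.path.insert(0, root)
import mypkg

print(mypkg.fast.square(7))   # 49
```

The same layout works unchanged when fast.py is replaced by a real extension module, since the import machinery treats both kinds of submodule alike.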

Courageous

Dec 9, 2001, 12:09:38 PM

>> I can see that if I just used a normal __init__.py and
>> then pointed it to C extension dlls, this would probably
>> work.
>>
>> Is this the approved way of doing this?
>
>Yes.

>You could probably hack something up by butchering the

>module object in some way, but the import code assumes that
>extension modules are just plain modules, not packages.

This is liveable, but somewhat painful for me. Let me tell
you why. I am currently translating the back end of a system
originally written in Python to a hybridized C++/Python
implementation with a C++ core. The heart of the system has
a very high invocation frequency, so it makes sense to put
that into C++. However, I need to make the prior interface
to the Python programmer the same as it was before. So what
I am doing is, for every module in the prior package, creating
a C++ equivalent.

That's a lot of .dlls, if you know what I mean.

Perhaps I'll break down and write a makefile. Doing a couple
dozen .dlls in VC 6.0 is a bit of a pain.

It's too bad there's not a way to create a package-level .dll
which responds to Python in such a way as to offer up its
internal modules, but all from within a single .dll. That would
be cool.

C//

Gordon McMillan

Dec 9, 2001, 1:53:47 PM
Courageous wrote:

[wants to implement a package in C]



>>You could probably hack something up by butchering the
>>module object in some way, but the import code assumes that extension
>>modules are just plain modules, not packages.
>
> This is liveable, but somewhat painful for me. Let me tell
> you why. I am currently translating the back end of a system
> originally written in Python to a hybridized C++/Python
> implementation with a C++ core. The heart of the system has
> a very high invocation frequency, so it makes sense to put
> that into C++. However, I need to make the prior interface
> to the Python programmer the same as it was before. So what
> I am doing is, for every module in the prior package, creating
> a C++ equivalent.
>
> That's a lot of .dlls, if you know what I mean.
>
> Perhaps I'll break down and write a makefile. Doing a couple
> dozen .dlls in VC 6.0 is a bit of a pain.
>
> It's too bad there's not a way to create a package-level .dll
> which responds to Python in such a way as to offer up its
> internal modules, but all from within a single .dll. That would
> be cool.

Oh. If that's what you want, you asked the wrong question! Except
for a few corner cases, there's no difference between package.module
and module.attribute: once the import is done, it's all
object.attribute anyway, and even while importing, the difference
is one of implementation only.

So implement <package>.c as your extension module, and arrange
for it to have module1, module2 etc. as attributes. Probably a
dozen or two lines of perfectly kosher code in init<package>.

-- Gordon
http://www.mcmillan-inc.com/
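Gordon's point that package.module is just object.attribute can be demonstrated from pure Python with synthetic module objects; this is a sketch of what an init&lt;package&gt; written in C would effectively do (all names invented):

```python
import sys
import types

# Emulate, in pure Python, a C init<package> that creates a package
# module and hangs submodules off it as attributes. Names are invented.
pa = types.ModuleType("pa")
mod1 = types.ModuleType("pa.mod1")
mod1.foo = "bar1"

pa.mod1 = mod1                   # package.module is just attribute access
sys.modules["pa"] = pa
sys.modules["pa.mod1"] = mod1    # lets "import pa.mod1" succeed as well

import pa.mod1
print(pa.mod1.foo)               # bar1
```

Once both entries are in sys.modules, the dotted import form and plain attribute access are indistinguishable to client code.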

Chris Barker

Dec 10, 2001, 1:33:02 PM, to Courageous
Courageous wrote:

> It's too bad there's not a way to create a package-level .dll
> which responds to Python in such a way as to offer up its
> internal modules, but all from within a single .dll. That would
> be cool.


Gordon McMillan wrote:
> So implement <package>.c as your extension module, and arrange
> for it to have module1, module2 etc. as attributes. Probably a
> dozen or two lines of perfectly kosher code in init<package>.

If you figure out how to do this, I'd love to see an example. It would
be pretty handy for me as well. Maybe even a cookbook entry?

-Chris


--
Christopher Barker, Ph.D.
ChrisH...@attbi.net
Oil Spill Modeling
Water Resources Engineering
Coastal and Fluvial Hydrodynamics

Courageous

Dec 10, 2001, 1:31:02 PM

>If you figure out how to do this, I'd love to see an example. It would
>be pretty handy for me as well. Maybe even a cookbook entry?

I've figured it out. It's not hard at all; however, there is a small
amount of repetitive labor involved. Here is the init method for
my "package" that I'm developing:

void __declspec(dllexport) initlrammp()
{
    PyObject* m = Py_InitModule3("lrammp", LrammpModuleMethods, LrammpModuleDocs);
    PyObject* d = PyModule_GetDict(m);

    PyDict_SetItemString(d, "simulator", __LRAMMP__SimulatorModuleInit());
    PyDict_SetItemString(d, "agent", __LRAMMP__AgentModuleInit());
}

Note the specific mutations on the "lrammp" module's dictionary at
init time. I am adding modules, as defined in the __LRAMMP__* forms,
directly to the dictionary. These are taken from separate source files
that have "pseudo init" methods of their own. They're not __declspec'd
and aren't intended to be invoked directly by Python.

This seems to work when tested from my .py file. For example, the
following behaves as expected:

from lrammp import simulator
from lrammp import agent

Likewise what is imported appears to be a proper module.

This functionality seems to fit the bill for me.

Not having to produce a couple dozen .dlls is a relief. :)

C//
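What makes this work is that "from X import Y" falls back to attribute lookup on X, so an entry in the parent module's dict (the PyDict_SetItemString calls above) is all that's needed; the submodule never has to appear in sys.modules under its dotted name. A pure-Python emulation, with the module contents invented:

```python
import sys
import types

# Pure-Python equivalent of initlrammp() above: the submodule exists
# only as an entry in the parent module's dict. Contents are invented.
lrammp = types.ModuleType("lrammp")
simulator = types.ModuleType("lrammp.simulator")
simulator.run = lambda: "running"

lrammp.__dict__["simulator"] = simulator   # ~ PyDict_SetItemString
sys.modules["lrammp"] = lrammp

from lrammp import simulator as sim        # attribute lookup suffices
print(sim.run())                           # running
print("lrammp.simulator" in sys.modules)   # False
```

The flip side is that a bare "import lrammp.simulator" would fail here, since nothing registered the dotted name in sys.modules.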


Steven Majewski

Dec 10, 2001, 4:50:48 PM

On Sun, 9 Dec 2001, Courageous wrote:

> This is liveable, but somewhat painful for me. Let me tell
> you why. I am currently translating the back end of a system
> originally written in Python to a hybridized C++/Python
> implementation with a C++ core. The heart of the system has
> a very high invocation frequency, so it makes sense to put
> that into C++. However, I need to make the prior interface
> to the Python programmer the same as it was before. So what
> I am doing is, for every module in the prior package, creating
> a C++ equivalent.
>
> That's a lot of .dlls, if you know what I mean.
>
> Perhaps I'll break down and write a makefile. Doing a couple
> dozen .dlls in VC 6.0 is a bit of a pain.
>
> It's too bad there's not a way to create a package-level .dll
> which responds to Python in such a way as to offer up its
> internal modules, but all from within a single .dll. That would
> be cool.

Before Jack fixed up the current Carbon modules for MacOSX Python,
I had problems building them as separate shared libraries because
they had so many routines shared between them (for example, a lot
of other toolbox modules use Mac Resources, so they call routines
in Resmodule directly, not by importing the module).
My temporary solution was to build ALL of the Carbon toolbox modules
into a single Carbonmodule.so .


Using this macro:

#define ADD_MODULE(x) if (-1 == \
    PyModule_AddObject( m, (x), PyImport_ImportModule((x)))) return

for each submodule I had (for example, for the Win module) the lines:

initWin();
ADD_MODULE("Win");


This put everything into a single shared library.

You could then do, for example:

from Carbon import Win

or, once you have done an initial:

import Carbon

all of the submodules are added to sys.modules, so you could also do:

import Win

This was intentional, so that once Carbon had been imported,
(perhaps in site.py ) old scripts that used the latter import
would still work.

I don't think it followed the package semantics entirely --
I don't think you could do:

import Carbon.Win


(I'm not entirely sure -- since Jack came up with a better scheme,
I don't have that code nearby any longer. It's been archived.)

But if you want to avoid having a whole big bunch of DLLs, you
could do something similar.


-- Steve Majewski
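Steven's extra trick, registering each submodule in sys.modules under its bare name so that a plain "import Win" keeps working, can likewise be sketched in pure Python; the module contents are invented:

```python
import sys
import types

# Emulate the Carbon arrangement: submodules live both as attributes of
# the umbrella module and under their bare names in sys.modules, so old
# "import Win" scripts keep working. Contents are invented.
Carbon = types.ModuleType("Carbon")
Win = types.ModuleType("Win")
Win.hello = "toolbox"

Carbon.Win = Win
sys.modules["Carbon"] = Carbon
sys.modules["Win"] = Win        # bare-name registration

import Win                      # old-style import still works
print(Win.hello)                # toolbox

from Carbon import Win as W     # package-style access works too
print(W is Win)                 # True
```

The bare-name registration is exactly why "import Carbon.Win" may still fail while both other import forms succeed, matching Steven's recollection.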

Alex Martelli

Dec 11, 2001, 10:10:21 AM
"Chris Barker" <chrish...@attbi.com> wrote in message
news:3C14FFDE...@attbi.com...
...

> IF you figure out how to do this, I'd love to see an example. It owuld
> be prettyhandy for me as well. Maybe even a cookbook entry?

You mean, something like the following pa.c...:

#include <Python.h>

static PyMethodDef nomethods[] = {
    {NULL, NULL}
};

void initmod1()
{
    PyObject* m = Py_InitModule("pa.mod1", nomethods);
    /* add module attributes, if any */
    PyModule_AddStringConstant(m, "foo", "bar1");
}

void initmod2()
{
    PyObject* m = Py_InitModule("pa.mod2", nomethods);
    /* add module attributes, if any */
    PyModule_AddStringConstant(m, "foo", "bar2");
}

void initpa()
{
    PyObject* module;
    PyObject* package = Py_InitModule("pa", nomethods);
    if (!package) return;

    /* add package attributes, if any */
    PyModule_AddStringConstant(package, "foo", "bar");

    module = PyImport_AddModule("pa.mod1");
    if (!module) return;
    if (PyModule_AddObject(package, "mod1", module))
        return;
    Py_INCREF(module);
    initmod1();

    module = PyImport_AddModule("pa.mod2");
    if (!module) return;
    if (PyModule_AddObject(package, "mod2", module))
        return;
    Py_INCREF(module);
    initmod2();
}

Of course, in real life you'd no doubt use something more
substantial as the package's and modules' contents, rather than
these feeble "nomethods" and string constants, but is this the
gist of what you mean?

I could surely post it as a cookbook recipe. BTW, here's the
setup.py to build this on any platform, just for completeness
(but it IS rather obvious, of course):

from distutils.core import setup, Extension

setup(name = "pa",
      version = "1.0",
      maintainer = "Alex Martelli",
      maintainer_email = "al...@aleax.it",
      description = "Sample Python multimodule package",
      ext_modules = [Extension('pa', sources=['pa.c'])]
      )


Waiting for some feedback (I may have misunderstood, or
made some silly mistake, as I just put this together:-)
before making a recipe of it...


Alex
