
NumPy arrays that use memory allocated from other libraries or tools


Travis Oliphant

Sep 10, 2008, 12:39:26 AM
to pytho...@python.org, rpy-...@lists.sourceforge.net, ctypes...@lists.sourceforge.net

I wanted to point anybody interested to a blog post that describes a
useful pattern for having a NumPy array that points to memory
allocated by a different memory manager than the standard one used by
NumPy. The pattern shows how to create a NumPy array that points to
previously allocated memory, and then how to construct an object that
ensures the correct deallocator is called when the NumPy array is
freed.

This may be useful if you are wrapping code that has its own memory
management scheme. Comments and feedback are welcome.

The post is

http://blog.enthought.com/?p=62
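
For anyone who wants the shape of the pattern without reading the
post, here is one minimal pure-Python sketch of the idea. The names
(ForeignBuffer) are hypothetical and it assumes a Unix-like libc for
malloc/free; see the post for the full treatment:

import ctypes
import numpy

libc = ctypes.CDLL(None)          # assumption: Unix-like libc
libc.malloc.restype = ctypes.c_void_p
libc.free.argtypes = [ctypes.c_void_p]

class ForeignBuffer(object):
    """Owns memory from a foreign allocator and exposes it through
    __array_interface__ so an ndarray can view it without copying.
    free() runs once this object (and every ndarray based on it)
    has been garbage collected."""
    def __init__(self, nbytes):
        self.ptr = libc.malloc(nbytes)
        if not self.ptr:
            raise MemoryError("foreign malloc failed")
        self.__array_interface__ = {
            'data': (self.ptr, False),   # (address, read-only flag)
            'shape': (nbytes,),
            'typestr': '|u1',            # unsigned bytes
            'version': 3,
        }
    def __del__(self):
        libc.free(self.ptr)

buf = ForeignBuffer(1024)
a = numpy.asarray(buf)   # a.base holds buf, keeping the memory alive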


Best regards,

-Travis Oliphant

sturlamolden

Sep 10, 2008, 2:15:26 PM
On Sep 10, 6:39 am, Travis Oliphant <oliphant.tra...@ieee.org> wrote:

> I wanted to point anybody interested to a blog post that describes a
> useful pattern for having a NumPy array that points to the memory
> created by a different memory manager than the standard one used by
> NumPy.


Here is something similar I have found useful:

There will be a new module in the standard library called
'multiprocessing' (cf. the pyprocessing package in the Cheese Shop).
It allows you to create multiple processes (as opposed to threads)
for concurrency on SMPs (cf. the dreaded GIL).

The 'multiprocessing' module lets us put ctypes objects in shared
memory segments (processing.Array and processing.Value). It has its
own malloc, so there is no 4k (one page) lower limit on object size.
Here is how we can make a NumPy ndarray view the shared memory
referenced by these objects:

try:
    import processing
except ImportError:
    import multiprocessing as processing

import numpy, ctypes

_ctypes_to_numpy = {
    ctypes.c_char   : numpy.int8,
    ctypes.c_wchar  : numpy.int16,   # assumes 16-bit wchar_t
    ctypes.c_byte   : numpy.int8,
    ctypes.c_ubyte  : numpy.uint8,
    ctypes.c_short  : numpy.int16,
    ctypes.c_ushort : numpy.uint16,
    ctypes.c_int    : numpy.int32,
    ctypes.c_uint   : numpy.uint32,
    ctypes.c_long   : numpy.int32,   # assumes 32-bit C long
    ctypes.c_ulong  : numpy.uint32,  # assumes 32-bit C long
    ctypes.c_float  : numpy.float32,
    ctypes.c_double : numpy.float64
}

def shmem_as_ndarray(array_or_value):
    """ view processing.Array or processing.Value as ndarray """
    obj = array_or_value._obj
    buf = obj._wrapper.getView()
    try:
        # processing.Value: a single ctypes scalar
        t = _ctypes_to_numpy[type(obj)]
        return numpy.frombuffer(buf, dtype=t, count=1)
    except KeyError:
        # processing.Array: look up the element type
        t = _ctypes_to_numpy[obj._type_]
        return numpy.frombuffer(buf, dtype=t)

With this simple tool we can make processes created by multiprocessing
work with ndarrays that reference the same shared memory segment. I'm
doing some scalability testing on this. It looks promising :)
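
For example, here is a hypothetical usage sketch (the worker function
and array size are made up, and it relies on the same 'processing'
internals as shmem_as_ndarray above):

import ctypes

def fill(shared_arr):
    # View the shared segment as an ndarray and mutate it in place;
    # the parent process sees the writes without any copying.
    a = shmem_as_ndarray(shared_arr)
    a[:] = 42.0

if __name__ == '__main__':
    shared_arr = processing.Array(ctypes.c_double, 10)  # synchronized wrapper
    p = processing.Process(target=fill, args=(shared_arr,))
    p.start()
    p.join()
    print(shmem_as_ndarray(shared_arr))   # all elements are 42.0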

Travis Oliphant

Sep 11, 2008, 9:45:27 AM
to pytho...@python.org


Hey, that is very neat.

Thanks for pointing me to it. I was not aware of this development in
multiprocessing.


-Travis
