
How to free memory (i.e. garbage collect) at run time with Python 2.5.1 (Windows)


rfv-370

Aug 27, 2007, 10:18:26 AM
I have made the following small test:

Before starting my test my UsedPhysicalMemory(PF): 555Mb

>>> tf = range(0, 10000000)    PF: 710Mb (so 155Mb for my list)
>>> tf = [0, 1, 2, 3, 4, 5]    PF: 672Mb (why is the remaining 117Mb not freed?)
>>> del tf                     PF: 672Mb (unused memory not freed)

So changing the list contents and/or deleting the list changes
nothing...from a memory point of view.

This is a problem, as I have several applications/threads competing for
memory (e.g. a wxPython app).

So how can I force Python to clean the memory and free the memory that
is not used?


PS: gc.collect() does nothing; to return the memory I need to kill my
process (i.e. the interpreter).

Any ideas are welcome.

Thanx

Robert

Alex Martelli

Aug 27, 2007, 10:46:29 AM
rfv-370 <robert....@yahoo.fr> wrote:

> I have made the following small test:
>
> Before starting my test my UsedPhysicalMemory(PF): 555Mb
>
> >>> tf = range(0, 10000000)    PF: 710Mb (so 155Mb for my list)
> >>> tf = [0, 1, 2, 3, 4, 5]    PF: 672Mb (why is the remaining 117Mb not freed?)
> >>> del tf                     PF: 672Mb (unused memory not freed)

Integer objects that are once generated are kept around in a "free list"
against the probability that they might be needed again in the future (a
few other types of objects similarly keep a per-type free-list, but I
think int is the only one that keeps an unbounded amount of memory
there). Like any other kind of "cache", this free-list (in normal
cases) hoards a bit more memory than needed, but results in better
runtime performance; anomalous cases like your example can however
easily "bust" this too-simple heuristic.

> So how can I force Python to clean the memory and free the memory that
> is not used?

On Windows, with Python 2.5, I don't know of a good approach (on Linux
and other Unix-like systems I've used a strategy based on forking, doing
the bit that needs a bazillion ints in the child process, ending the
child process; but that wouldn't work on Win -- no fork).
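A minimal sketch of that fork-based strategy (Unix-like systems only; the
helper names here are made up for illustration, they are not from the post):

import os

def run_in_child(work):
    # Run work() in a forked child so all of its memory is returned
    # to the OS when the child exits.
    pid = os.fork()
    if pid == 0:              # child: do the int-heavy job, then exit hard
        try:
            work()
        finally:
            os._exit(0)
    os.waitpid(pid, 0)        # parent: wait for the child to finish

def heavy():
    tf = range(10000000)      # the bazillion-int part of the job
    # ... use tf ...

run_in_child(heavy)

Passing a result back to the parent would need a pipe, a temporary file, or
something similar.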

I suggest you enter a feature request to let gc grow a way to ask every
type object to prune its cache, on explicit request from the Python
program; this will not solve the problem in Python 2.5, but work on 3.0
is underway and this is just the right time for such requests.


Alex

MC

Aug 27, 2007, 10:48:34 AM
Hi!

For Windows, I have a small program for this (reduce_memory.exe). Sorry, it's
not written in Python.

If you want, I will give you a URL to download it.

--
@-salutations

Michel Claveau


Marc 'BlackJack' Rintsch

Aug 27, 2007, 11:41:55 AM
On Mon, 27 Aug 2007 07:18:26 -0700, rfv-370 wrote:

> I have made the following small test:
>
> Before starting my test my UsedPhysicalMemory(PF): 555Mb
>
>>>> tf = range(0, 10000000)    PF: 710Mb (so 155Mb for my list)
>>>> tf = [0, 1, 2, 3, 4, 5]    PF: 672Mb (why is the remaining 117Mb not freed?)
>>>> del tf                     PF: 672Mb (unused memory not freed)
>
> So changing the list contents and/or deleting the list changes
> nothing...from a memory point of view.

From the OS memory point of view to be precise. Although here the integer
cache applies as Alex pointed out, in the general case memory might be
kept allocated by Python to be re-used later. The Python allocator is
better suited for frequent allocation and deallocation of many small
objects than `malloc()` from the C standard library is. And even a
`free()` in the C standard library doesn't guarantee that memory is given
back to the OS and reported as free memory again.
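For anyone who wants to watch what the OS reports from inside the interpreter
instead of the Task Manager, here is a rough Windows-only sketch (assuming
ctypes, which ships with Python 2.5; the field layout follows the documented
Win32 PROCESS_MEMORY_COUNTERS structure):

import ctypes
from ctypes import wintypes

class PROCESS_MEMORY_COUNTERS(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("PageFaultCount", wintypes.DWORD),
        ("PeakWorkingSetSize", ctypes.c_size_t),
        ("WorkingSetSize", ctypes.c_size_t),
        ("QuotaPeakPagedPoolUsage", ctypes.c_size_t),
        ("QuotaPagedPoolUsage", ctypes.c_size_t),
        ("QuotaPeakNonPagedPoolUsage", ctypes.c_size_t),
        ("QuotaNonPagedPoolUsage", ctypes.c_size_t),
        ("PagefileUsage", ctypes.c_size_t),
        ("PeakPagefileUsage", ctypes.c_size_t),
    ]

def working_set_mb():
    # Ask the OS for this process's working set, in megabytes.
    counters = PROCESS_MEMORY_COUNTERS()
    counters.cb = ctypes.sizeof(counters)
    ctypes.windll.psapi.GetProcessMemoryInfo(
        ctypes.windll.kernel32.GetCurrentProcess(),
        ctypes.byref(counters),
        counters.cb)
    return counters.WorkingSetSize / (1024.0 * 1024.0)

print working_set_mb()
tf = range(10000000)
print working_set_mb()   # jumps by roughly the size of the list plus the ints
del tf
print working_set_mb()   # typically does not drop back, for the reasons above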

Ciao,
Marc 'BlackJack' Rintsch

Peter Otten

Aug 27, 2007, 11:56:29 AM
Alex Martelli wrote:

> Integer objects that are once generated are kept around in a "free list"
> against the probability that they might be needed again in the future (a

Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
[GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> x = 1000
>>> y = 1000
>>> x is y
False

Why don't x and y point to the same object then?

Peter

Peter Otten

Aug 27, 2007, 12:06:16 PM
Peter Otten wrote:

Oops, I think I got it now. The actual objects are freed, only the memory is
kept around for reuse with other ints...

Peter

Méta-MCI (MVP)

Aug 27, 2007, 1:31:46 PM
Re!

Sent by direct (private) e-mail.

@+

Michel Claveau


Méta-MCI (MVP)

Aug 27, 2007, 2:03:23 PM
Ouch!

Gmail said "illegal attachment" (because of the .exe?).
I will try to send a zipped file...

@+

Michel Claveau


Francesco Guerrieri

Aug 28, 2007, 3:11:08 AM
to Peter Otten, pytho...@python.org

On my (Windows) machine, only integers up to 256 are cached...
I made two dictionaries mapping i to id(i) and then compared them;
they were equal up to 256.
In short, try your example with 256 and 257 and see what happens :-)
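For illustration, the experiment run at the interactive prompt (the exact
cutoff is a CPython implementation detail and can change between versions):

>>> x = 256
>>> y = 256
>>> x is y          # small ints are pre-created and shared
True
>>> x = 257
>>> y = 257
>>> x is y          # larger ints created on separate lines are separate objects
False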

francesco

Bruno Desthuilliers

Aug 28, 2007, 4:24:00 AM
Alex Martelli wrote:
> rfv-370 <robert....@yahoo.fr> wrote:
(snip)

>> So how can I force Python to clean the memory and free the memory that
>> is not used?
>
> On Windows, with Python 2.5, I don't know of a good approach (on Linux
> and other Unix-like systems I've used a strategy based on forking, doing
> the bit that needs a bazillion ints in the child process, ending the
> child process; but that wouldn't work on Win -- no fork).

IIRC, Windows has its own way to let you launch other processes, so a
similar strategy might apply here...
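A rough sketch of that idea using the subprocess module, which also works on
Windows (worker.py is a made-up name for the example; it would contain the
int-heavy part of the job, and all of its memory goes back to the OS when it
exits):

import subprocess
import sys

# The hypothetical worker.py does the memory-hungry work and prints its
# result to stdout.
proc = subprocess.Popen([sys.executable, "worker.py"],
                        stdout=subprocess.PIPE)
output, _ = proc.communicate()   # wait for the child; its memory is freed on exit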

Delaney, Timothy (Tim)

Aug 28, 2007, 8:02:44 PM
to pytho...@python.org
Alex Martelli wrote:

> rfv-370 <robert....@yahoo.fr> wrote:
>
>> I have made the following small test:
>>
>> Before starting my test my UsedPhysicalMemory(PF): 555Mb
>>
>>>>> tf = range(0, 10000000)    PF: 710Mb (so 155Mb for my list)
>>>>> tf = [0, 1, 2, 3, 4, 5]    PF: 672Mb (why is the remaining 117Mb not freed?)
>>>>> del tf                     PF: 672Mb (unused memory not freed)
>
> Integer objects that are once generated are kept around in a "free
> list" against the probability that they might be needed again in the
> future (a few other types of objects similarly keep a per-type
> free-list, but I think int is the only one that keeps an unbounded
> amount of memory there). Like any other kind of "cache", this
> free-list (in normal cases) hoards a bit more memory than needed, but
> results in better runtime performance; anomalous cases like your
> example can however easily "bust" this too-simple heuristic.

In particular, the *memory* for integer objects is kept around - the
free list will be as big as the total number of integer objects that
exist simultaneously (with a few caveats, but it's close enough). Small
integers are pre-created and cached, and don't use the free list - off
the top of my head, from -5..255 now to cover the entire bytes range,
plus a few negatives, but that depends on the version of python.

When you call range(), you create that number of integer objects
simultaneously, and so you have that amount of memory allocated for use
by other integers for the lifetime of the python process.

If on the other hand you only created one of those integers at a time
(e.g. by iterating over the result from an xrange() call) you would only
be using (continually re-using) a single entry in the free list in total
and the python memory usage would be much smaller.
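An illustrative comparison of the two approaches (summing the numbers just to
give the loop something to do):

# Creates all ten million int objects at once; their memory stays cached
# by the interpreter afterwards.
tf = range(10000000)
total = sum(tf)

# Creates only a handful of int objects at any one moment.
total = sum(xrange(10000000))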

Tim Delaney

Méta-MCI (MVP)

Aug 29, 2007, 9:21:35 AM
Hi!

I've sent the software.
But I have no news; is that good news?

@+

Michel Claveau
