Here is what I get on 32-bit architecture:
cat /proc/meminfo
MemTotal: 8309860 kB
MemFree: 5964888 kB
Buffers: 84396 kB
Cached: 865644 kB
SwapCached: 0 kB
......
The program output:
N 131072
SIZE 100 MB
SIZE 200 MB
SIZE 300 MB
SIZE 400 MB
SIZE 500 MB
SIZE 600 MB
SIZE 700 MB
SIZE 800 MB
SIZE 900 MB
SIZE 1000 MB
SIZE 1100 MB
SIZE 1200 MB
SIZE 1300 MB
SIZE 1400 MB
SIZE 1500 MB
SIZE 1600 MB
SIZE 1700 MB
SIZE 1800 MB
SIZE 1900 MB
SIZE 2000 MB
SIZE 2100 MB
SIZE 2200 MB
SIZE 2300 MB
SIZE 2400 MB
SIZE 2500 MB
SIZE 2600 MB
SIZE 2700 MB
SIZE 2800 MB
SIZE 2900 MB
SIZE 3000 MB
Traceback (most recent call last):
File "bs.py", line 14, in ?
data = struct.pack("%ss" % (bs,), "")
MemoryError
++++++++++++++++++++
The number of list elements for a given block size is 131072.
If I change the block size, the script raises MemoryError at the same total size, 3000 MB.
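For reference, a minimal sketch of the kind of script the traceback suggests (the loop structure, the `fill` helper, and the Python 3 byte-string syntax are assumptions; the original `bs.py` is not shown in full):

```python
import struct

def fill(block_size, max_mb):
    """Append zero-padded blocks of block_size bytes to a list until
    max_mb megabytes are held or a MemoryError occurs."""
    blocks = []
    total = 0
    try:
        while total < max_mb * 1024 * 1024:
            # struct.pack pads the empty string out to block_size null bytes
            blocks.append(struct.pack("%ds" % block_size, b""))
            total += block_size
    except MemoryError:
        print("MemoryError at %d MB" % (total // (1024 * 1024)))
    return blocks

blocks = fill(4096, 1)  # kept tiny here; the original ran until ~3000 MB
print(len(blocks))      # 1 MiB / 4096 B = 256 blocks
```

Running it with a large `max_mb` on a 32-bit build reproduces the MemoryError well below the physical RAM size.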
Somewhere I read that a list could have 2147483647 items on most platforms.
Somewhere else that it is 536,870,912
(http://stackoverflow.com/questions/855191/how-big-can-a-python-array-get).
But what is the maximal size of the whole list, including the size of
its elements?
Thanks.
But it would answer that question pretty fast, because then you'd see
that all list-object methods are defined in terms of Py_ssize_t, which
is an alias for your platform's ssize_t. On 64-bit that should be a 64-bit long.
Diez
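You can also see that limit without reading the C source; a quick sketch using `sys.maxsize`, which is the Python-level view of Py_ssize_t:

```python
import sys

# sys.maxsize is the largest value a Py_ssize_t can hold, and therefore
# an upper bound on len() of any container in this interpreter:
# 2**31 - 1 on a 32-bit build, 2**63 - 1 on a 64-bit build.
print(sys.maxsize)
print(sys.maxsize.bit_length())  # 31 or 63
```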
> But it would answer that question pretty fast. Because then you'd see
Then how do you explain the program output?
Alex.
What exactly? That after 3 GB it ran out of memory? Because you don't
have 4 GB of address space available per process.
Diez
Because it has no finite answer
> What is the total maximal size of list including size of its elements?
In theory, unbounded. In practice, limited by the memory of the interpreter.
The maximum # of elements depends on the interpreter. Each element can
be a list whose maximum # of elements ..... and recursively so on...
Terry Jan Reedy
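A tiny illustration of that point (a sketch): Py_ssize_t bounds the element *count*, not the memory reachable through the list.

```python
# Each element can itself be a large object, so the list's length says
# nothing about the total memory reachable through it.
outer = [[0] * 1000 for _ in range(10)]
print(len(outer))                          # the list itself holds only 10 items
print(sum(len(inner) for inner in outer))  # but 10000 ints are reachable
```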
Did you see my posting?
....
Here is what I get on 32-bit architecture:
cat /proc/meminfo
MemTotal: 8309860 kB
MemFree: 5964888 kB
Buffers: 84396 kB
Cached: 865644 kB
SwapCached: 0 kB
.....
I have more than 5G in memory not speaking of swap space.
I am not asking about the maximum number of elements; I am asking about
the total maximal size of the list, including the size of its elements.
In other words: if the size of each list element is ELEMENT_SIZE and all
elements have the same size, what would be the maximal number of these
elements on a 32-bit architecture?
I see 3 GB, and wonder why. Why not 2 GB or 4 GB?
AlexM
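For what it's worth, the 3 GB figure is consistent with the usual 3 GB/1 GB user/kernel address-space split on 32-bit Linux. A rough back-of-the-envelope sketch (the 40-byte per-element overhead is an assumption; the 800-byte block size follows from N and the 100 MB steps above):

```python
# On 32-bit Linux with the common 3 GB/1 GB split, a process has about
# 3 GB of user address space. With fixed-size elements, the element
# count is roughly that budget divided by the per-element cost.
user_space = 3 * 1024**3                # ~3 GB user address space (assumption)
element_size = 100 * 1024**2 // 131072  # = 800 bytes, matching N = 131072 per 100 MB
per_element = element_size + 40         # payload + object/pointer overhead (assumption)
print(user_space // per_element)        # on the order of 3.8 million elements
```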
At a guess you were running this in 32-bit Windows. By default it reserves the
upper two gigabytes of address space for mapping system DLLs. It can be configured
to use just one gigabyte for that, and it seems your system is; or you're using
some other system with that kind of behavior; or it's just arbitrary...
Cheers & hth.,
- Alf (by what mechanism do socks disappear from the washer?)
Yes, I saw your posting. 32-bit is 32-bit. Do you know about PAE?
http://de.wikipedia.org/wiki/Physical_Address_Extension
Just because the system can deal with more overall memory, one process
can't get more than 4 GB (or even less, due to re-mapped memory),
except if it uses specific APIs, like the old high-memory stuff under DOS.
Diez
No, it is 32-bit Linux.
Alex
I already answered that (as did Alf; the principle applies to both OSs):
kernel memory space is mapped into the address space, reducing it by 1
or 2 GB.
Diez
Yes, I do. Good catch! I have PAE enabled, but I guess I compiled
Python without extended memory. So I was looking in the wrong place.
Thanks!
AlexM
You can't compile it with PAE. It's an extension that doesn't make sense
in a general-purpose language. It is used by databases and the like,
which can hold large structures in memory that don't need random access
but can cope with windowing.
Diez
Well, there actually is a way of building programs that may use more
than 4 GB of memory on 32-bit machines under Linux with highmem kernels,
but I guess this would not work for Python.
I'll just switch to 64-bit architecture.
Thanks again.
AlexM
As I said, it's essentially paging:
http://kerneltrap.org/node/2450
And it's not something you can just compile in; you need explicit
code support for it, which Python doesn't have, and neither do most
other programs. So there is no magic compile option.
> I'll just switch to 64-bit architecture.
That's the solution, yes :)
Diez
Beware: sys.getsizeof(alist) is 4*len(alist) plus a bit, regardless of
alist's contents!
See http://stackoverflow.com/questions/2117255/python-deep-getsizeof-list-with-contents
cheers
-- denis
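A short sketch of that shallow behavior:

```python
import sys

# sys.getsizeof on a list is shallow: it counts the list object and its
# pointer array, not the objects those pointers refer to.
small = [b"" for _ in range(100)]
big = [b"x" * 10**6 for _ in range(100)]   # ~100 MB of payload
print(sys.getsizeof(small) == sys.getsizeof(big))  # True: same pointer array
deep = sys.getsizeof(big) + sum(sys.getsizeof(e) for e in big)
print(deep > 100 * 10**6)                  # True once elements are counted
```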