
Large arrays in C#


Clive Tooth

Feb 9, 2012, 8:22:17 AM
This works ok...
const int n = 500000000;
int[] a = new int[n];
a[1] = 2;

This fails with "Exception of type 'System.OutOfMemoryException' was
thrown."...
const int n = 600000000;
int[] a = new int[n];
a[1] = 2;

There seems to be a limit of 2GB on any array.

# If there is such a limit, where is it stated?
# How do I declare an array of more than 536,870,912 (=2^29) integers?

My computer has 16GB of RAM.

I am using Visual C# 2010 Express, Windows 7 Home Premium Version 6.1
(build 7601: Service Pack 1)

Microsoft Visual Studio 2010
Version 10.0.30319.1 RTMRel
Microsoft .NET Framework
Version 4.0.30319 RTMRel

--
Clive Tooth

Marcel Müller

Feb 9, 2012, 9:23:00 AM
On 09.02.2012 14:22, Clive Tooth wrote:
> This works ok...
> const int n = 500000000;
> int[] a = new int[n];
> a[1] = 2;
>
> This fails with "Exception of type 'System.OutOfMemoryException' was
> thrown."...
> const int n = 600000000;
> int[] a = new int[n];
> a[1] = 2;
>
> There seems to be a limit of 2GB on any array.

No .NET object can exceed 2GB in size; the allocator does not support it.
Because of some internal overhead, the practical maximum is somewhat less.

> # If there is such a limit, where is it stated?
> # How do I declare an array of more than 536,870,912 (=2^29) integers?

I would not recommend allocating that much memory in one block, for
several reasons.
Dealing with fragmentation within objects of that size adds no
significant overhead. But dealing with address space fragmentation will
be a serious problem. There are BigArray implementations available.
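The usual trick in those implementations is to keep the data in chunks
that each stay well below the 2GB object limit and to index across them
with a long. A rough sketch of the idea (untested, names are only
illustrative, not any particular library):

using System;

// Chunked "big array": every chunk stays far below the 2GB per-object limit,
// but the index is a long, so the total size can exceed 2GB on a 64-bit runtime.
public sealed class BigArray<T>
{
    private const int ChunkBits = 24;                 // 2^24 = ~16.8 million elements per chunk
    private const int ChunkSize = 1 << ChunkBits;
    private const int ChunkMask = ChunkSize - 1;

    private readonly T[][] chunks;
    private readonly long length;

    public BigArray(long length)
    {
        this.length = length;
        int chunkCount = (int)((length + ChunkSize - 1) >> ChunkBits);
        chunks = new T[chunkCount][];
        for (int i = 0; i < chunkCount; i++)
        {
            long remaining = length - ((long)i << ChunkBits);
            chunks[i] = new T[(int)Math.Min(remaining, ChunkSize)];
        }
    }

    public long Length { get { return length; } }

    public T this[long index]
    {
        get { return chunks[index >> ChunkBits][index & ChunkMask]; }
        set { chunks[index >> ChunkBits][index & ChunkMask] = value; }
    }
}

// Usage, matching the original example:
// var a = new BigArray<int>(600000000);
// a[1] = 2;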

> My computer has 16GB of RAM.

This does not count, since the allocation is done in virtual memory. As
long as you do not access all these elements, most operating systems
will not associate physical memory with the address space. So the
important point is that you have a 64 bit platform and a 64 bit runtime.
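A quick way to check both from inside the program (these properties
exist since .NET 4.0):

Console.WriteLine(Environment.Is64BitOperatingSystem); // 64 bit Windows?
Console.WriteLine(Environment.Is64BitProcess);         // 64 bit process? (false if the project targets x86)
Console.WriteLine(IntPtr.Size * 8);                     // 32 or 64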


Marcel

Peter Duniho

Feb 9, 2012, 10:16:46 AM
On Thu, 09 Feb 2012 15:23:00 +0100, Marcel Müller wrote:

> [...]
>> # If there is such a limit, where is it stated?
>> # How do I declare an array of more than 536,870,912 (=2^29) integers?
>
> I would not recommend allocating that much memory in one block, for
> several reasons.
> Dealing with fragmentation within objects of that size adds no
> significant overhead. But dealing with address space fragmentation will
> be a serious problem. There are BigArray implementations available.

Address space fragmentation is not an issue on 64-bit Windows. Not for a
"small" array as in this example, anyway.

If the OP is trying to allocate such a large object on 32-bit Windows, then
yes...that's a bad idea. It's not even possible, due to the 2GB
per-process limit on virtual address space, but of course even large
allocations that are possible would be a bad idea.

But presumably this is for a 64-bit process running on 64-bit Windows. A
2GB allocation is really no big deal in that environment. There's a
ginormous amount of virtual address space to work with in that case (to use
the technical term :) ).

>> My computer has 16GB of RAM.
>
> This does not count, since the allocation is done in virtual memory. As
> long as you do not access all these elements, most operating systems
> will not associate physical memory with the address space. So the
> important point is that you have a 64 bit platform and a 64 bit runtime.

Yes, and it _ought_ to work. The .NET array types even support 64-bit
array lengths (LongLength property) in theory. The problem is that you
can't get .NET itself to allocate an array that large.
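For example, in a 64-bit .NET 4 process this still fails at the second
allocation no matter how much RAM is free (a sketch of the behaviour,
using the OP's numbers):

int[] ok = new int[500000000];      // ~1.9GB object: allocates fine in a 64-bit process
Console.WriteLine(ok.LongLength);   // LongLength exists and returns a long
int[] big = new int[600000000];     // ~2.2GB object: throws OutOfMemoryException anyway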

It's one of my pet peeves: 64-bit programs in .NET aren't really fully
64-bits. You get the advantages of the larger address space and higher
per-process memory limits, but individual objects are still limited in size
(as discussed here). It's very annoying. :(

Pete

Matt

Feb 9, 2012, 4:21:32 PM
That's correct, it was a conscious decision on the part of the
designers.
You can read one of the guys' motivations here:

http://blogs.msdn.com/b/joshwil/archive/2005/08/10/450202.aspx

It also shows a pretty nice way to get around the problem :)

>
> It's one of my pet peeves: 64-bit programs in .NET aren't really fully
> 64-bits.  You get the advantages of the larger address space and higher
> per-process memory limits, but individual objects are still limited in size
> (as discussed here).  It's very annoying.  :(

I agree, and likely it will be fixed in a future release. However,
realistically, how often do you really want to be able to allocate 2G
of memory in a single block? (And, as I type this, I can hear Bill
Gates chuckling about his 256k comment).

matt

Peter Duniho

Feb 9, 2012, 8:46:44 PM
On Thu, 9 Feb 2012 13:21:32 -0800 (PST), Matt wrote:

> [...]
> That's correct, it was a conscious decision on the part of the
> designers.

Well, I don't think anyone was suggesting that .NET left large object
allocations out accidentally! :)

> You can read one of the guys' motivations here:
>
> http://blogs.msdn.com/b/joshwil/archive/2005/08/10/450202.aspx
>
> It also shows a pretty nice way to get around the problem :)

I think all three options are hacks, at least from the perspective of those
who actually need large objects to be allocated. Yes, they work. And no, they are not
terribly complicated. But I'd use the word "nice" only inasmuch as we
acknowledge that the need for the work-arounds in the first place is
unfortunate.

>> It's one of my pet peeves: 64-bit programs in .NET aren't really fully
>> 64-bits.  You get the advantages of the larger address space and higher
>> per-process memory limits, but individual objects are still limited in size
>> (as discussed here).  It's very annoying.  :(
>
> I agree, and likely it will be fixed in a future release.

This, I doubt. The article you referenced hedged a bit, but note that it
was written prior to the actual release of v2.0. 4.5 is about to come out,
and we still don't see any change in this particular issue.

I'm not holding my breath.

> However, realistically, how often do you really want to be able to
> allocate 2G of memory in a single block? (And, as I type this, I can
> hear Bill Gates chuckling about his 256k comment).

"How often" is the wrong question. Much of any framework, including .NET,
would be exclude if that was the metric.

"How useful" is more relevant, and there is a significant subset of
computing today in which handling large data sets is in fact useful.

Pete

Marcel Müller

Feb 10, 2012, 3:57:53 AM
On 10.02.2012 02:46, Peter Duniho wrote:
> On Thu, 9 Feb 2012 13:21:32 -0800 (PST), Matt wrote:
>
>> [...]
>> That's correct, it was a conscious decision on the part of the
>> designers.
>
> Well, I don't think anyone was suggesting that .NET left large object
> allocations out accidentally! :)

Most likely it is an optimization.


>> I agree, and likely it will be fixed in a future release.
>
> This, I doubt. The article you referenced hedged a bit, but note that it
> was written prior to the actual release of v2.0. 4.5 is about to come out,
> and we still don't see any change in this particular issue.

It is not a question of the .NET version. It is only a question of the
VM. E.g. the Mono .NET VM does not have this restriction on 64 bit hosts.


Marcel

Peter Duniho

Feb 10, 2012, 12:49:44 PM
On Fri, 10 Feb 2012 09:57:53 +0100, Marcel Müller wrote:

> [...]
>>> I agree, and likely it will be fixed in a future release.
>>
>> This, I doubt. The article you referenced hedged a bit, but note that it
>> was written prior to the actual release of v2.0. 4.5 is about to come out,
>> and we still don't see any change in this particular issue.
>
> It is not a question of the .NET version. It is only a question of the
> VM.

I don't understand that statement. The CLR implementation for .NET is
tied exactly to the .NET version.

> E.g. the Mono .NET VM does not have this restriction on 64 bit hosts.

Mono isn't even .NET. It's a "completely different" (not counting the code
base it started from) implementation of the CLR. It's not relevant at all
to the question at hand, of whether the .NET implementation of the CLR will
ever support allocations > 2GB.

Pete

Arne Vajhøj

Feb 14, 2012, 9:19:56 PM
It does not matter much that LongLength exists. If it did not,
it could be added.

The question is what to do with all the code that uses
Length.

And because arrays and collections interact in many ways,
making arrays "64 bit capable" would also require
collections to become "64 bit capable".

And then there is serialization between 32 and 64 bit systems.

It will not be a simple change.

Arne

Arne Vajhøj

Feb 14, 2012, 9:23:20 PM
AFAIK, Mono does not officially support Windows in 64-bit mode.

Arne


Peter Duniho

Feb 14, 2012, 11:24:23 PM
On Tue, 14 Feb 2012 21:19:56 -0500, Arne Vajhøj wrote:

> On 2/9/2012 10:16 AM, Peter Duniho wrote:
>> On Thu, 09 Feb 2012 15:23:00 +0100, Marcel Müller wrote:
>>> This does not count, since the allocation is done in virtual memory. As
>>> long as you do not access all these elements, most operating systems
>>> will not associate physical memory with the address space. So the
>>> important point is that you have a 64 bit platform and a 64 bit runtime.
>>
>> Yes, and it _ought_ to work. The .NET array types even support 64-bit
>> array lengths (LongLength property) in theory. The problem is that you
>> can't get .NET itself to allocate an array that large.
>>
>> It's one of my pet peeves: 64-bit programs in .NET aren't really fully
>> 64-bits. You get the advantages of the larger address space and higher
>> per-process memory limits, but individual objects are still limited in size
>> (as discussed here). It's very annoying. :(
>
> It does not matter much that LongLength exists. If it did not,
> it could be added.

It "matters" inasmuch as it suggests someone was actually thinking about
array lengths > 2^31-1 elements, but did not go "all the way". Which was
the point of what I wrote.

> The question is what to do with all the code that uses
> Length.

Such code is still limited to 2^31-1 elements. But first, note that the
limit is 2GB objects. You can have (many) fewer than 2^31-1 elements and
still take advantage of an object size larger than 2GB.
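For example (hypothetical struct, just to illustrate the distinction):

struct Big { public long A, B, C, D; }   // 32 bytes per element
Big[] b = new Big[70000000];             // only 70 million elements, yet ~2.2GB:
                                         // still throws OutOfMemoryException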

So you are not really talking about the same thing that the OP is asking
about.

Second, code that was limited to using the Length property would simply be
restricted to arrays with at most 2^31-1 elements. How best to enforce
this is debatable, but IMHO the best way would be to throw an exception if
the Length property is used for an array where the length is larger than an
int value can store.
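Sketched as a helper method, the behaviour I mean would look something
like this (hypothetical; it is not how System.Array behaves today):

static int CheckedLength(Array a)
{
    long len = a.LongLength;
    if (len > int.MaxValue)
        throw new OverflowException("Array is too long for the 32-bit Length property.");
    return (int)len;
}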

Most code would "just work". After all, it's unlikely (not impossible, but
not likely) an array would wind up with too many elements for the Length
property to be valid if the code wasn't already aware of larger arrays.

> And because arrays and collections interact in many ways,
> making arrays "64 bit capable" would also require
> collections to become "64 bit capable".
>
> And then there is serialization between 32 and 64 bit systems.
>
> It will not be a simple change.

No one said it would. But it's well within the "normal" in terms of the
kind of development resources that are invested in .NET. Lots of new
features in .NET and C# have appeared that require much more work than
enabling object allocation sizes larger than 2GB and array/collection sizes
greater than 2^32-1 (noting that the two are related, but independent
issues).

Pete

Clive Tooth

Feb 16, 2012, 8:15:34 AM
Thanks very much to all contributors to this thread. I have now
implemented my own 'BigArray' class that allows me to create the large
structures that I want to use.

--
Clive Tooth

Arne Vajhøj

Feb 16, 2012, 9:04:14 PM
On 2/14/2012 11:24 PM, Peter Duniho wrote:
> On Tue, 14 Feb 2012 21:19:56 -0500, Arne Vajhøj wrote:
>> On 2/9/2012 10:16 AM, Peter Duniho wrote:
>>> On Thu, 09 Feb 2012 15:23:00 +0100, Marcel Müller wrote:
>>>> This does not count, since the allocation is done in virtual memory. As
>>>> long as you do not access all these elements, most operating systems
>>>> will not associate physical memory with the address space. So the
>>>> important point is that you have a 64 bit platform and a 64 bit runtime.
>>>
>>> Yes, and it _ought_ to work. The .NET array types even support 64-bit
>>> array lengths (LongLength property) in theory. The problem is that you
>>> can't get .NET itself to allocate an array that large.
>>>
>>> It's one of my pet peeves: 64-bit programs in .NET aren't really fully
>>> 64-bits. You get the advantages of the larger address space and higher
>>> per-process memory limits, but individual objects are still limited in size
>>> (as discussed here). It's very annoying. :(
>>
>> It does not matter much that LongLength exists. If it did not,
>> it could be added.
>
> It "matters" inasmuch as it suggests someone was actually thinking about
> array lengths > 2^31-1 elements, but did not go "all the way". Which was
> the point of what I wrote.

Solving 1/10000 of the problem is not really solving anything.

Not only did they not go all the way - they took a step of
0.01 mm forward.


>> The question is what to do with all the code that uses
>> Length.
>
> Such code is still limited to 2^31-1 elements. But first, note that the
> limit is 2GB objects. You can have (many) fewer than 2^31-1 elements and
> still take advantage of an object size larger than 2GB.
>
> So you are not really talking about the same thing that the OP is asking
> about.

No.

But given that I did not quote anything from the original post, that
should not surprise anyone.

> Second, code that was limited to using the Length property would simply be
> restricted to arrays with at most 2^31-1 elements. How best to enforce
> this is debatable, but IMHO the best way would be to throw an exception if
> the Length property is used for an array where the length is larger than an
> int value can store.

I completely agree that an exception would be necessary.

> Most code would "just work". After all, it's unlikely (not impossible, but
> not likely) an array would wind up with too many elements for the Length
> property to be valid if the code wasn't already aware of larger arrays.

No. Very likely.

Every library that takes arrays or collections as input would be affected.

>> And because arrays and collections interact in many ways,
>> making arrays "64 bit capable" would also require
>> collections to become "64 bit capable".
>>
>> And then there is serialization between 32 and 64 bit systems.
>>
>> It will not be a simple change.
>
> No one said it would. But it's well within the "normal" in terms of the
> kind of development resources that are invested in .NET. Lots of new
> features in .NET and C# have appeared that require much more work than
> enabling object allocation sizes larger than 2GB and array/collection sizes
> greater than 2^31-1 (noting that the two are related, but independent
> issues).

MS could add this stuff in relatively few hours.

But the effort by MS is just a tiny fraction of the problem.

Arne


Jan Kotas

Apr 18, 2012, 6:11:09 PM
.NET Framework 4.5 is adding support for 2GB+ arrays: http://msdn.microsoft.com/en-us/library/hh285054(v=vs.110).aspx
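The linked page describes an opt-in runtime setting rather than a code
change. Roughly: in a 64-bit process you add this to app.config (element
counts per dimension are still capped even with the flag enabled):

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>

and then the original example works with the larger size:

const int n = 600000000;
int[] a = new int[n];   // a 2.4GB object, allowed once the flag is set
a[1] = 2;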

Jeff Johnson

Apr 20, 2012, 12:12:25 PM


"Matt" wrote in message
news:72bb5f78-5008-406c...@1g2000yqv.googlegroups.com...

> (And, as I type this, I can hear Bill Gates chuckling about his 256k
> comment).

A) It was 640K.

B) There's no real proof that Gates actually said this, but it lives on in
apocrypha....

Radu Cosoveanu

May 3, 2012, 9:19:41 AM
As you know, on 32 bits the per-process addressing limit is 2 GB.
There are two options:
- use memory as a raw block and address as much of it as you want yourself, if you really want to go in that direction, which unfortunately is not a good one (see the sketch below).
- change the solution to use a storage system or a database.
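A sketch of the first option on a 64-bit process, using a memory-mapped
file backed by the page file and addressed with 64-bit offsets (the map
name and sizes are only an example):

using System;
using System.IO.MemoryMappedFiles;

class Program
{
    static void Main()
    {
        const long count = 600000000;                    // 600 million ints = 2.4 GB
        using (var mmf = MemoryMappedFile.CreateNew("bigblock", count * sizeof(int)))
        using (var view = mmf.CreateViewAccessor())
        {
            view.Write(1L * sizeof(int), 2);             // the equivalent of a[1] = 2
            Console.WriteLine(view.ReadInt32(1L * sizeof(int)));
        }
    }
}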
Why do you need so much memory?

Kind regards,
Radu Cosoveanu.
