.NET Framework 4.5 is adding support for 2GB+ arrays:
http://msdn.microsoft.com/en-us/library/hh285054(v=vs.110).aspx
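To opt in, a 64-bit application enables the <gcAllowVeryLargeObjects> element in its configuration file. A minimal sketch, mirroring the failing allocation from the thread below (600,000,000 ints is roughly 2.4GB of array data):

    <!-- app.config: allow single objects larger than 2GB (64-bit processes only) -->
    <configuration>
      <runtime>
        <gcAllowVeryLargeObjects enabled="true" />
      </runtime>
    </configuration>

    // C#, .NET 4.5+, 64-bit process: with the flag enabled this no longer throws.
    // Note that the element count per dimension is still capped at roughly 2^31;
    // only the 2GB total-object-size limit is lifted.
    const int n = 600000000;   // 600M ints ~= 2.4GB of array data
    int[] a = new int[n];
    a[1] = 2;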
> On Thursday, February 09, 2012 8:22 AM Clive Tooth wrote:
> This works ok...
> const int n = 500000000;
> int[] a = new int[n];
> a[1] = 2;
>
> This fails with "Exception of type 'System.OutOfMemoryException' was
> thrown."...
> const int n = 600000000;
> int[] a = new int[n];
> a[1] = 2;
>
> There seems to be a limit of 2GB on any array.
>
>
> My computer has 16GB of RAM.
>
> I am using Visual C# 2010 Express, Windows 7 Home Premium Version 6.1
> (build 7601: Service Pack 1)
>
> Microsoft Visual Studio 2010
> Version 10.0.30319.1 RTMRel
> Microsoft .NET Framework
> Version 4.0.30319 RTMRel
>
> --
> Clive Tooth
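For context, the arithmetic behind the two results above: each int is 4 bytes, so the difference is simply whether the raw array data fits under the 2GB per-object cap discussed in the replies.

    // Why 500,000,000 succeeds and 600,000,000 fails on .NET 4.0:
    //   500,000,000 * 4 bytes = 2,000,000,000 bytes -> under 2^31 (2,147,483,648), succeeds
    //   600,000,000 * 4 bytes = 2,400,000,000 bytes -> over the cap, OutOfMemoryException
    long okBytes  = 500000000L * sizeof(int);   // 2,000,000,000
    long badBytes = 600000000L * sizeof(int);   // 2,400,000,000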
>> On Thursday, February 09, 2012 9:23 AM Marcel_Müller wrote:
>> On 09.02.2012 14:22, Clive Tooth wrote:
>>
>> No .NET object can exceed 2GB in size; the allocator does not support it.
>> Because of some internal overhead, the practical maximum is somewhat less.
>>
>>
>> I would not recommend allocating that much memory in one block, for
>> several reasons.
>> Dealing with fragmentation inside objects of that size adds no
>> significant overhead, but dealing with address space fragmentation is a
>> serious concern. There are BigArray implementations available.
>>
>>
>> That does not matter much, since the allocation is done in virtual memory. As
>> long as you do not access all of these elements, most operating systems
>> will not associate physical memory with the address space. So the
>> important point is that you have a 64-bit platform and a 64-bit runtime.
>>
>>
>> Marcel
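The "BigArray" workarounds Marcel mentions typically split the storage across several smaller arrays so that no single object crosses the 2GB cap. A minimal sketch of that chunking idea (the class name, chunk size, and members are invented here for illustration, not taken from any particular library):

    using System;

    // Chunked pseudo-array: each chunk stays far below the 2GB per-object
    // limit, while the logical length is a long and may exceed int.MaxValue.
    public sealed class BigArray<T>
    {
        private const int ChunkSize = 1 << 20;   // 1M elements per chunk
        private readonly T[][] chunks;

        public long Length { get; private set; }

        public BigArray(long length)
        {
            Length = length;
            long chunkCount = (length + ChunkSize - 1) / ChunkSize;
            chunks = new T[chunkCount][];
            for (long i = 0; i < chunkCount; i++)
            {
                long remaining = length - i * ChunkSize;
                chunks[i] = new T[Math.Min(remaining, ChunkSize)];
            }
        }

        public T this[long index]
        {
            get { return chunks[index / ChunkSize][index % ChunkSize]; }
            set { chunks[index / ChunkSize][index % ChunkSize] = value; }
        }
    }

    // Usage, mirroring the original post:
    //   var a = new BigArray<int>(600000000);
    //   a[1] = 2;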
>>> On Thursday, February 09, 2012 10:16 AM Peter Duniho wrote:
>>> Address space fragmentation is not an issue on 64-bit Windows. Not for a
>>> "small" array as in this example, anyway.
>>>
>>> If the OP is trying to allocate such a large object on 32-bit Windows, then
>>> yes...that is a bad idea. In fact, it is not even possible, due to the 2GB
>>> per-process limit on virtual address space; but of course even large
>>> allocations that are possible would be a bad idea.
>>>
>>> But presumably this is for a 64-bit process running on 64-bit Windows. A
>>> 2GB allocation is really no big deal in that environment. There is a
>>> ginormous amount of virtual address space to work with in that case (to use
>>> the technical term :) ).
>>>
>>>
>>> Yes, and it _ought_ to work. The .NET array types even support 64-bit
>>> array lengths (LongLength property) in theory. The problem is that you
>>> cannot get .NET itself to allocate an array that large.
>>>
>>> It is one of my pet peeves: 64-bit programs in .NET are not really fully
>>> 64-bit. You get the advantages of the larger address space and higher
>>> per-process memory limits, but individual objects are still limited in size
>>> (as discussed here). It is very annoying. :(
>>>
>>> Pete
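Pete's point is visible in the .NET 4.0 API surface itself: the 64-bit-friendly members exist, but the runtime will not actually hand out an array that needs them. A small sketch (behaviour as documented for .NET 4.0; the exact exception depends on the scenario):

    int[] a = new int[100];
    long len = a.LongLength;   // the 64-bit length property is there...

    // ...but allocations past the 32-bit limits are refused. A length above
    // Int32.MaxValue is rejected outright by Array.CreateInstance, and a
    // length that fits in an int but implies an object over 2GB fails as in
    // the original post (OutOfMemoryException).
    Array big = Array.CreateInstance(typeof(int), 3000000000L);
    // -> ArgumentOutOfRangeException on .NET 4.0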
>>>> On Thursday, February 09, 2012 4:21 PM Matt wrote:
>>>>
>>>> That's correct, it was a conscious decision on the part of the
>>>> designers.
>>>>
>>>> I agree, and likely it will be fixed in a future release. However,
>>>> realistically, how often do you really want to be able to allocate 2GB
>>>> of memory in a single block? (And, as I type this, I can hear Bill Gates
>>>> chuckling about his 256k comment).
>>>>
>>>> matt
>>>>> On Thursday, February 09, 2012 8:46 PM Peter Duniho wrote:
>>>>> Well, I do not think anyone was suggesting that .NET left large object
>>>>> allocations out accidentally! :)
>>>>>
>>>>>
>>>>> I think all three options are hacks, relative to those people who actually
>>>>> need large objects to be allocated. Yes, they work. And no, they are not
>>>>> terribly complicated. But I'd use the word "nice" only inasmuch as we
>>>>> acknowledge that the need for the work-arounds in the first place is
>>>>> unfortunate.
>>>>>
>>>>>
>>>>> This, I doubt. The article you referenced hedged a bit, but note that it
>>>>> was written prior to the actual release of v2.0. Now .NET 4.5 is about to
>>>>> come out, and we still do not see any change in this particular issue.
>>>>>
>>>>> I am not holding my breath.
>>>>>
>>>>>
>>>>> "How often" is the wrong question. Much of any framework, including .NET,
>>>>> would be exclude if that was the metric.
>>>>>
>>>>> "How useful" is more relevant, and there is a significant subset of
>>>>> computing today in which handling large data sets is in fact useful.
>>>>>
>>>>> Pete
>>>>>> On Friday, February 10, 2012 3:57 AM Marcel Müller wrote:
>>>>>> On 10.02.2012 02:46, Peter Duniho wrote:
>>>>>>
>>>>>> Most likely it is an optimization.
>>>>>>
>>>>>>
>>>>>>
>>>>>> It is not a question of the .NET version. It is only a question of the
>>>>>> VM. E.g. the Mono .NET VM does not have this restriction on 64 bit hosts.
>>>>>>
>>>>>>
>>>>>> Marcel
>>>>>>> On Friday, February 10, 2012 12:49 PM Peter Duniho wrote:
>>>>>>> I do not understand that statement. The CLR implementation for .NET
>>>>>>> depends precisely on the .NET version.
>>>>>>>
>>>>>>>
>>>>>>> Mono is not even .NET. It is a "completely different" (not counting the code
>>>>>>> base it started from) implementation of the CLR. It is not relevant at all
>>>>>>> to the question at hand, of whether the .NET implementation of the CLR will
>>>>>>> ever support allocations > 2GB.
>>>>>>>
>>>>>>> Pete
>>>>>>>> On Tuesday, February 14, 2012 9:19 PM Arne Vajhøj wrote:
>>>>>>>> On 2/9/2012 10:16 AM, Peter Duniho wrote:
>>>>>>>>
>>>>>>>> It does not matter much that LongLength exists. If it did not,
>>>>>>>> it could be added.
>>>>>>>>
>>>>>>>> The question is what to do with all the code that uses
>>>>>>>> Length.
>>>>>>>>
>>>>>>>> And because arrays and collections interact in many ways,
>>>>>>>> making arrays "64 bit capable" would also require
>>>>>>>> collections to become "64 bit capable".
>>>>>>>>
>>>>>>>> And then there is serialization between 32 and 64 bit systems.
>>>>>>>>
>>>>>>>> It will not be a simple change.
>>>>>>>>
>>>>>>>> Arne
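The compatibility problem Arne describes is the sheer volume of existing code that assumes int-sized lengths. A contrived sketch of the ubiquitous pattern (the names are invented for illustration):

    // int index and int Length are baked into virtually all existing code;
    // an array longer than int.MaxValue has no valid int value for Length
    // to return, and this loop could not even reach the upper elements.
    static long Sum(int[] arr)
    {
        long total = 0;
        for (int i = 0; i < arr.Length; i++)
            total += arr[i];
        return total;
    }

    // The same assumption sits in the collection interfaces that arrays
    // implement, e.g. ICollection<T>.Count is also an int.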
>>>>>>>>> On Tuesday, February 14, 2012 9:23 PM Arne Vajhøj wrote:
>>>>>>>>> On 2/10/2012 3:57 AM, Marcel Müller wrote:
>>>>>>>>>
>>>>>>>>> AFAIK, Mono does not officially support Windows in 64-bit mode.
>>>>>>>>>
>>>>>>>>> Arne
>>>>>>>>>> On Tuesday, February 14, 2012 11:24 PM Peter Duniho wrote:
>>>>>>>>>> It "matters" inasmuch as it suggests someone was actually thinking about
>>>>>>>>>> array lengths > 2^32-1 elements, but did not go "all the way". Which was
>>>>>>>>>> the point of what I wrote.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Such code is still limited to 2^31-1 elements. But first, note that the
>>>>>>>>>> limit is on object size, 2GB. You can have (many) fewer than 2^31-1
>>>>>>>>>> elements and still take advantage of an object size larger than 2GB.
>>>>>>>>>>
>>>>>>>>>> So you are not really talking about the same thing that the OP is asking
>>>>>>>>>> about.
>>>>>>>>>>
>>>>>>>>>> Second, code that was limited to using the Length property would simply be
>>>>>>>>>> restricted to arrays with fewer than 2^31-1 elements. How best to enforce
>>>>>>>>>> this is debatable, but IMHO the best way would be to throw an exception if
>>>>>>>>>> the Length property is used for an array whose length is larger than an
>>>>>>>>>> int value can store.
>>>>>>>>>>
>>>>>>>>>> Most code would "just work". After all, it is unlikely (not impossible, but
>>>>>>>>>> not likely) an array would wind up with too many elements for the Length
>>>>>>>>>> property to be valid if the code was not already aware of larger arrays.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> No one said it would. But it is well within the "normal" in terms of the
>>>>>>>>>> kind of development resources that are invested in .NET. Lots of new
>>>>>>>>>> features in .NET and C# have appeared that required much more work than
>>>>>>>>>> enabling object allocation sizes larger than 2GB and array/collection sizes
>>>>>>>>>> greater than 2^31-1 (noting that the two are related, but independent
>>>>>>>>>> issues).
>>>>>>>>>>
>>>>>>>>>> Pete
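The "fewer elements but still over 2GB" case Pete distinguishes shows up as soon as the element type is more than a few bytes wide, for example an array of structs. A sketch (the struct is invented for illustration):

    // 32 bytes per element * 100,000,000 elements = 3.2GB of object size,
    // even though the element count is far below int.MaxValue. On .NET 4.0
    // this fails with OutOfMemoryException because of the 2GB per-object
    // cap, not because of the element count.
    struct Sample
    {
        public double X, Y, Z, W;   // 4 * 8 bytes = 32 bytes
    }

    Sample[] data = new Sample[100000000];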
>>>>>>>>>>>> On Thursday, February 16, 2012 9:04 PM Arne Vajhøj wrote:
>>>>>>>>>>>> On 2/14/2012 11:24 PM, Peter Duniho wrote:
>>>>>>>>>>>>
>>>>>>>>>>>> Solving 1/10000 of the problem is not really solving anything.
>>>>>>>>>>>>
>>>>>>>>>>>> Not only did they not go all the way - they took a step of
>>>>>>>>>>>> 0.01 mm forward.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> No.
>>>>>>>>>>>>
>>>>>>>>>>>> But given that I did not quote anything from the original post,
>>>>>>>>>>>> that should not surprise anyone.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> I completely agree that an exception would be necessary.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> No. Very likely.
>>>>>>>>>>>>
>>>>>>>>>>>> All libraries that take arrays or collections as input.
>>>>>>>>>>>>
>>>>>>>>>>>>