
Total malloc limit of 2 gigabytes per process?


dh...@my-dejanews.com

Mar 17, 1999
I have a machine with 2 gigabytes of RAM. Total virtual memory available is
5 gigabytes. However, I am unable to allocate a total of more than 2
gigabytes in a single process. I've included source code below that
repeatedly allocates chunks of a given size until malloc fails. I've never
been able to get it past the 2 GB limit. Any help is appreciated. BTW, I'm
using Windows NT 4.0 Service Pack 3 and VC++ version 5.0.

#include <stdio.h>
#include <stdlib.h>

#define MAX_MEM_SIZES 20

static int Mem_size[ MAX_MEM_SIZES ];
static int Num_mem_sizes;

int main( int argc, char *argv[] )
{
    int i;
    int j;

    if( argc == 1 )
    {
        fprintf( stderr, "You must specify a malloc chunk size.\n" );
        exit( EXIT_FAILURE );
    }

    Num_mem_sizes = argc - 1;
    if( Num_mem_sizes > MAX_MEM_SIZES )    /* don't overrun Mem_size[] */
        Num_mem_sizes = MAX_MEM_SIZES;

    for( i = 0; i < Num_mem_sizes; ++i )
        Mem_size[ i ] = atoi( argv[ i + 1 ] );

    for( i = 0; i < Num_mem_sizes; ++i )
    {
        /* Allocate (and deliberately never free) chunks until malloc fails. */
        for( j = 0; ; ++j )
        {
            if( malloc( Mem_size[ i ] ) == NULL )
                break;

            if( !( j % 1000 ) )
            {
                printf( "\rMalloced %10d chunks of %8d bytes.", j, Mem_size[ i ] );
                fflush( stdout );
            }
        }
        printf( "\rMalloced %10d chunks of %8d bytes.\n", j, Mem_size[ i ] );
        fflush( stdout );
    }
    return 0;
}

-----------== Posted via Deja News, The Discussion Network ==----------
http://www.dejanews.com/ Search, Read, Discuss, or Start Your Own

Andy Moreton

Mar 17, 1999
In article <7cosgu$rs2$1...@nnrp1.dejanews.com>, dh...@my-dejanews.com wrote:
>I have a machine with 2 gigabytes of RAM. Total virtual memory available is
>5 gigabytes. However I am unable to allocate a total of more than 2
>gigabytes (with one process). I've included source code
>below that continually allocates memory of given chunk sizes until malloc
>fails. I've never been able to get this to go past the 2 gig limit. Any
>help is appreciated. BTW, I'm using Windows NT 4.0 service pack 3 and VC++
>version 5.0.

This is simple to answer.

In Windows NT every user process has its own (0 - 4GB) virtual address
space. The address space is partitioned into:

User space   - the address space available to your process
System space - the address space used by the OS and device drivers

In NT4.0 Workstation and Server:
User space is 0 - 2GB, System space is 2GB - 4GB

In NT4.0 Server Enterprise Edition:
User space is 0 - 3GB, System space is 3GB - 4GB

So your problem is that you have run out of address space, not memory. If
you require such a large memory allocation, I suggest that you:
a) Don't do it - use a better algorithm for your processing
b) Use another platform :-)

AndyM

--
Andy Moreton
Virata Ltd http://www.virata.com/
Mount Pleasant House, Huntingdon Road, Cambridge CB3 0BL, UK
Tel: +44 1223 566919 Fax: +44 1223 566915

Pete Delgado

Mar 17, 1999
Read the Windows documentation again... I think it will
become very clear...

Of course, I suspect that this is a troll...

-Pete

dh...@my-dejanews.com wrote in message <7cosgu$rs2$1...@nnrp1.dejanews.com>...

Richard Heathfield

Mar 17, 1999
Ahem.

2097152 kilobytes ought to be enough for anybody (to coin a phrase).

--
Richard H

#include "sig.h"


dh...@my-dejanews.com wrote in article <7cosgu$rs2$1...@nnrp1.dejanews.com>...


> I have a machine with 2 gigabytes of RAM. Total virtual memory available is
> 5 gigabytes. However I am unable to allocate a total of more than 2
> gigabytes (with one process). I've included source code

<snip>

Jerry Coffin

Mar 17, 1999
In article <7cosgu$rs2$1...@nnrp1.dejanews.com>, dh...@my-dejanews.com says...

> I have a machine with 2 gigabytes of RAM. Total virtual memory available is
> 5 gigabytes. However I am unable to allocate a total of more than 2
> gigabytes (with one process). I've included source code
> below that continually allocates memory of given chunk sizes until malloc
> fails. I've never been able to get this to go past the 2 gig limit. Any
> help is appreciated. BTW, I'm using Windows NT 4.0 service pack 3 and VC++
> version 5.0.

This is a documented limit in NT 4. If you're using the Enterprise
Edition of NT Server, you can increase that to 3 Gig available in a
process.

Beyond that, they're doing some work in NT 5 (aka Win2k) to allow
programs to work with larger amounts of memory. In a future, 64-bit
version of Windows, the intent is to integrate this into the system, so
any program for that version will work with large amounts of memory
automatically. That, however, is some way off. (The "near future" in
this case means NT 5 is in beta test; the relatively distant future
means I, at least, haven't seen anything like working 64-bit code yet.)

Chris Marriott

Mar 17, 1999

dh...@my-dejanews.com wrote in message <7cosgu$rs2$1...@nnrp1.dejanews.com>...
>I have a machine with 2 gigabytes of RAM. Total virtual memory available is
>5 gigabytes. However I am unable to allocate a total of more than 2
>gigabytes (with one process).

That's what one would expect. Each process runs in a 4GB address space; 2GB
of that address space is yours, the other 2GB is owned by the operating
system.

Chris
-----------------------------------------------------------------------
Chris Marriott, SkyMap Software, UK (ch...@skymap.com)
Visit our web site at http://www.skymap.com
Astronomy software written by astronomers, for astronomers

Harold Weissfield

Mar 17, 1999
I loved hearing that Bill Gates quote myself!

-HaroldW

Richard Heathfield wrote in message <01be70b5$1b51f4e0$2a01a8c0@arc7>...

Robert Schlabbach

Mar 17, 1999
Jerry Coffin wrote in message ...

>Beyond that, they're doing some work in NT 5 (aka Win2k) to allow
>programs to work with larger amounts of memory. In a future, 64-bit
>version of Windows, there's the intent to integrate this into the
>system, so any program for that version will work with large amounts
>of memory automatically. That, however, is sometime in the relatively
>distant future. (The "near future" in this case meaning it's in beta
>test, the relatively distant future meaning I, at least, haven't seen
>anything like working code yet.)

But the _documentation_ is already available. Microsoft calls this "VLM"
and offers APIs like VirtualAllocVlm, VirtualFreeVlm, CopyMemoryVlm, etc.
to support it. Also, the requirements for VLM are already outlined:

"- VLM is supported only on processors that directly support native 64-bit
addressing. At present, these include the Compaq DIGITAL AlphaServer
processors and specifically exclude the Intel 80286, 80386, 80486, Pentium,
and Pentium II processors.
- VLM is supported only on computers that have at least 128 megabytes (MB)
of physical memory.
<snip>
- VLM is available on Windows NT Server Enterprise Edition, version 5.0."

So it WON'T be available on any existing Intel CPU, and the OS required
will now be named "Windows 2000 Advanced Server", I think...

Regards,
--
Robert Schlabbach
e-mail: rob...@powerstation.isdn.cs.TU-Berlin.DE
Technical University of Berlin, Germany


Lawrence Kirby

Mar 17, 1999
In article <MPG.1159be601...@news.rmi.net>
jco...@taeus.com "Jerry Coffin" writes:

>In article <7cosgu$rs2$1...@nnrp1.dejanews.com>, dh...@my-dejanews.com
>says...

>> I have a machine with 2 gigabytes of RAM. Total virtual memory available is
>> 5 gigabytes. However I am unable to allocate a total of more than 2

>> gigabytes (with one process). I've included source code
>> below that continually allocates memory of given chunk sizes until malloc
>> fails. I've never been able to get this to go past the 2 gig limit. Any
>> help is appreciated. BTW, I'm using Windows NT 4.0 service pack 3 and VC++
>> version 5.0.
>
>This is a documented limit in NT 4. If you're using the Enterprise
>Edition of NT Server, you can increase that to 3 Gig available in a
>process.
>

>Beyond that, they're doing some work in NT 5 (aka Win2k) to allow
>programs to work with larger amounts of memory.

Visions of good(!) old DOS style memory models coming to the 32 bit world.

--
-----------------------------------------
Lawrence Kirby | fr...@genesis.demon.co.uk
Wilts, England | 7073...@compuserve.com
-----------------------------------------


Jerry Coffin

Mar 17, 1999
In article <+jq*-ea...@news.virata.com>, a...@virata.com says...

[ ... ]

> In NT4.0 Server Enterprise Edition
> User space is 0 - 3GB System space is 3GB - 4GB

Note that this is NOT automatic. You can _make_ it happen, but it
doesn't happen unless NT sees the correct switch at boot time (/3GB
in boot.ini, IIRC). This also applies _only_ to the Intel version of
NT, not the Alpha version -- rather ironic, given that Alphas are
likely to be used more often in big, heavy-duty servers...
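For reference, the switch goes at the end of the OS entry in boot.ini. A sketch (the ARC path and description string are illustrative; your boot.ini will differ):

```
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINNT
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINNT="Windows NT Server 4.0, Enterprise Edition" /3GB
```

IIRC an executable must also be linked with the /LARGEADDRESSAWARE option before NT will actually hand it addresses above 2GB, even with the boot switch set.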

Andy Lutomirski

Mar 17, 1999
Or use sections, mapping and unmapping views as needed; or use NT5, which
will deal with this on non-Intel systems; or convince MS to let you have
segments of several GB each. Yuch!


Andy Lutomirski

Mar 17, 1999
Or with SP3 and the /3gb flag at startup I think... see www.sysinternals.com


dh...@my-dejanews.com

Mar 18, 1999
Thanks for the info everybody.

P.S. Wish I were a troll. In the fantasy world these limits don't exist.
However, my fingers would probably be too big for coding anyway.

P.P.S. What do you mean by troll? Please send your response by email. I
don't want to clutter the newsgroup any more than it already is. Thanks.

In article <7cp1cr$c...@nfs0.sdrc.com>,


"Pete Delgado" <Peter....@sdrc.com> wrote:
> Read the Windows documentation again... I think it will
> become very clear...
>
> Of course, I suspect that this is a troll...
>
> -Pete
>

> dh...@my-dejanews.com wrote in message <7cosgu$rs2$1...@nnrp1.dejanews.com>...

> >I have a machine with 2 gigabytes of RAM. Total virtual memory available is
> >5 gigabytes. However I am unable to allocate a total of more than 2
> >gigabytes (with one process). I've included source code
> >below that continually allocates memory of given chunk sizes until malloc
> >fails. I've never been able to get this to go past the 2 gig limit. Any
> >help is appreciated. BTW, I'm using Windows NT 4.0 service pack 3 and VC++
> >version 5.0.
>
>


Dann Corbit

Mar 18, 1999
<dh...@my-dejanews.com> wrote in message
news:7crrel$aiv$1...@nnrp1.dejanews.com...

>Thanks for the info everybody.
>
>P.S. Wish I was a troll. In the fantasy world these limits don't exist.
>However my fingers would probably be too big for coding anyway.
>
>P.P.S. What do you mean by troll? Please send your response by
>email. I don't want to clutter the newsgroup more than it already is.
>Thanks.

Limits exist where we want them to and disappear when we wish them to.
Except when they don't.

Look at NT for the Alpha. Maybe you don't have to change a thing. From:
http://windows.digital.com/products/Server/8x00nt.asp

"VLM64 works by using the 64-bit architecture of Alpha to make large amounts
of fast system memory available to databases and applications."

Questions of this nature are usually best solved in a system specific group
or with a web search.
--
C-FAQ: http://www.eskimo.com/~scs/C-faq/top.html
"The C-FAQ Book" ISBN 0-201-84519-9
C.A.P. Newsgroup http://www.dejanews.com/~c_a_p
C.A.P. FAQ: ftp://38.168.214.175/pub/Chess%20Analysis%20Project%20FAQ.htm


Joe Wright

Mar 18, 1999

With all due respect, I think this is a troll. Two gigabytes of main
memory is a stretch in itself. Five gigabytes of anything is suspect.
4.29 (i.e. 2^32 bytes), maybe. Five?

Having said that, address space and available (real/virtual) memory are
two different things. Talking about malloc() notwithstanding, this is
not a C question. Ask the OS vendor.
--
Joe Wright mailto:cons...@infi.net
"Everything should be made as simple as possible, but not simpler."
--- Albert Einstein ---

Dann Corbit

Mar 18, 1999
Mark McDougall <ma...@vl.com.au> wrote in message
news:36F19451...@vl.com.au...

>dh...@my-dejanews.com wrote:
>
>> I have a machine with 2 gigabytes of RAM. Total virtual memory available is
>> 5 gigabytes. However I am unable to allocate a total of more than 2
>> gigabytes (with one process).
>
>I'm curious to know exactly *why* anyone would require such large
>amounts of memory. It smacks of a 'brute-force' approach to me. Granted,
>it may be 'simpler' or 'faster' to allocate such a large chunk of
>memory, but programmers have been dealing with far more restrictive
>platform limitations than modern-day memory limits! That's all part of
>the challenge!

I have 5.9 gigabytes of chess endgame tablebase positions in a set of files.
If I had enough ram, I'd memory map them all. More memory just means that
you can tackle problems you could not tackle before. If I had 100 gigs of
ram, I would create a 34 bit Zobrist hash and [almost] never suffer a
collision for my chess lookups. A PC with 128 megs of ram was a ludicrous
concept ten years ago. Now it is pretty much a bare minimum for
programming.

You can't have too much ram. Ever.

Mark McDougall

Mar 19, 1999
dh...@my-dejanews.com wrote:

> I have a machine with 2 gigabytes of RAM. Total virtual memory available is
> 5 gigabytes. However I am unable to allocate a total of more than 2
> gigabytes (with one process).

I'm curious to know exactly *why* anyone would require such large
amounts of memory. It smacks of a 'brute-force' approach to me. Granted,
it may be 'simpler' or 'faster' to allocate such a large chunk of
memory, but programmers have been dealing with far more restrictive
platform limitations than modern-day memory limits! That's all part of
the challenge!

So, do tell...

--
Mark McDougall |
Engineer |
Virtual Logic Pty Ltd |
<http://www.vl.com.au> |

Lawrence Kirby

Mar 19, 1999
In article <7cs634$o1j$1...@client2.news.psi.net>
dco...@solutionsiq.com "Dann Corbit" writes:

...

>You can't have too much ram. Ever.

I recently learnt the hard way that you can have too much memory if you
try to run Windows 95 (don't ask). It seems that with anything more than
64MB (certainly with 128MB) Win95 systematically destroys itself. ;-)

Richard Heathfield

Mar 19, 1999
Lawrence Kirby <fr...@genesis.demon.co.uk> wrote in article
<921811...@genesis.demon.co.uk>...

> In article <7cs634$o1j$1...@client2.news.psi.net>
> dco...@solutionsiq.com "Dann Corbit" writes:
>
> ...
>
> >You can't have too much ram. Ever.
>
> I recently learnt the hard way that you can have too much memory if you
> try to run Windows 95 (don't ask). It seems that with anything more than
> 64MB (certainly with 128MB) Win95 systematically destroys itself. ;-)

Shhhh! Don't tell my machine. It hos 128Mb and is runn1nq Win95 a#d is
st!ll wkoring 9erfect1y.

--
Richard Heathfield

"The bug stops here."


Lawrence Kirby

Mar 19, 1999
In article <01be71ca$4b0e1860$2a01a8c0@arc7>
bin...@eton.powernet.co.uk "Richard Heathfield" writes:

You know it is only a matter of time. :-)

Daniel Barker

Mar 19, 1999
On Fri, 19 Mar 1999, Mark McDougall wrote:

> dh...@my-dejanews.com wrote:
>
> > I have a machine with 2 gigabytes of RAM. Total virtual memory available is
> > 5 gigabytes. However I am unable to allocate a total of more than 2
> > gigabytes (with one process).
>
> I'm curious to know exactly *why* anyone would require such large
> amounts of memory. It smacks of a 'brute-force' approach to me. Granted,
> it may be 'simpler' or 'faster' to allocate such a large chunk of
> memory, but programmers have been dealing with far more restrictive
> platform limitations than modern-day memory limits! That's all part of
> the challenge!
>
> So, do tell...

I have heard that some meteorological simulations basically require an
enormous amount of memory. For such work, distributed-memory
supercomputers (i.e., very fast networks) have become popular because of
their large total amount of RAM, rather than for their total amount of
processing power.

I was surprised to learn this, having (very painfully) messed with a
distributed-memory system in search of integer processing power and not
really noticed the large amount of memory present. But this wasn't a
problem in meteorology. Obviously, whether a large amount of RAM is
useful or not depends on the task at hand. (Whether purpose-made
distributed-memory systems are useful, in this age of large address
spaces, symmetric multiprocessors and fast networks of desktop computers
is an interesting question though.)


Daniel Barker,
Institute of Cell and Molecular Biology,
Swann Building,
King's Buildings,
Mayfield Road,
Edinburgh
EH9 3JR
UK


L

Mar 20, 1999
 

Dann Corbit wrote:

> Mark McDougall <ma...@vl.com.au> wrote in message
> news:36F19451...@vl.com.au...
>
> >dh...@my-dejanews.com wrote:
> >
> >> I have a machine with 2 gigabytes of RAM. Total virtual memory available is
> >> 5 gigabytes. However I am unable to allocate a total of more than 2
> >> gigabytes (with one process).
> >
> >I'm curious to know exactly *why* anyone would require such large
> >amounts of memory. It smacks of a 'brute-force' approach to me. Granted,
> >it may be 'simpler' or 'faster' to allocate such a large chunk of
> >memory, but programmers have been dealing with far more restrictive
> >platform limitations than modern-day memory limits! That's all part of
> >the challenge!
>
> I have 5.9 gigabytes of chess endgame tablebase positions in a set of
> files. If I had enough ram, I'd memory map them all. More memory just
> means that you can tackle problems you could not tackle before. If I had
> 100 gigs of ram, I would create a 34 bit Zobrist hash and [almost] never
> suffer a collision for my chess lookups. A PC with 128 megs of ram was a
> ludicrous concept ten years ago. Now it is pretty much a bare minimum for
> programming.

My question is why do programmers need 128 megs of RAM to develop a
program, but after it's compiled, the end user may only need 32 megs of
RAM to run it? Is it just that the compiler needs lots of memory to do
its thing?

Ben Pfaff

Mar 20, 1999
L <ERASE_THIS_...@recording.hostway.com> writes:

> My question is why do programmers need 128 megs of RAM to develop a
> program, but after it's compiled, the end user may only need 32 megs of
> RAM to run it? Is it just that the compiler needs lots of memory to do
> its thing?

Yes.

By the way, don't post in HTML.
--
"I ran it on my DeathStation 9000 and demons flew out of my nose." --Kaz

Please: do not email me copies of your posts to comp.lang.c
do not ask me C questions via email; post them instead

L

Mar 20, 1999
but html is fun.

Quenya

Mar 21, 1999
On my newsreader, your HTML postings are shown as completely blank
documents in the preview pane and in the message reader window. It appears
to all intents and purposes as if you have sent an empty message. These are
very quick to read, thankfully. But if you want feedback on your postings,
isn't it better to send them out in a format which every newsreader can
read correctly (i.e. straight text)?

'Quenya'

L wrote in article <36F4700A...@recording.hostway.com>...

> but html is fun.
>
> Ben Pfaff wrote:
>

<snip>
