--max-old-space-size option to increase memory limit to 1900MB


gitfy

Aug 9, 2011, 7:22:15 PM
to nodejs
Hi,
I am facing memory problems with our server. When the server process
hits the 1 GB limit, it hogs the CPU for a long time and then crashes
with "FATAL ERROR: JS Allocation failed - process out of memory".

I searched around and found the option --max-old-space-size=1900 to
increase the memory limit. But even after providing it via

node --max-old-space-size=1900 server.js

when the process hits 1 GB, it still hogs the CPU, stalls, and
eventually dies.

Is there a different way to specify this option so that the server
process can use more than 1 GB of memory?
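One way to check whether the limit actually took effect is to log the process's own memory counters from inside the server. A minimal sketch using the standard process.memoryUsage() API on a modern Node (the function name is just for illustration):

```javascript
// Print V8 heap usage so growth can be watched against the configured limit.
// Run the server with: node --max-old-space-size=1900 server.js
function logMemory() {
  const m = process.memoryUsage();
  const mb = (n) => (n / 1024 / 1024).toFixed(1);
  console.log(
    `heapUsed=${mb(m.heapUsed)}MB heapTotal=${mb(m.heapTotal)}MB rss=${mb(m.rss)}MB`
  );
}

// Log every 10 seconds; unref() so the timer does not keep the process alive.
setInterval(logMemory, 10000).unref();
logMemory();
```

If heapTotal keeps climbing toward the old-space limit while the process stalls, the flag is being honored and the pressure is real heap usage (or a leak).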

Any help is appreciated.
Thanks.

Marak Squires

Aug 9, 2011, 11:34:16 PM
to nod...@googlegroups.com
To my knowledge, there is a hard limit in v8 which you cannot bypass. See: http://groups.google.com/group/nodejs/browse_thread/thread/5597be34a3f0a7b9

Two questions come up:

1. Are you certain that your RAM usage is "valid"? Perhaps you are leaking somewhere? How high is your load when you hit this limit?

2. Is it possible to spread out your logic across multiple nodes?

- Marak

--
Job Board: http://jobs.nodejs.org/
Posting guidelines: https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to nod...@googlegroups.com
To unsubscribe from this group, send email to
nodejs+un...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en

gitfy

Aug 10, 2011, 12:22:17 AM
to nodejs
I found the issue with the parameter.

I was running the node process in a cluster, and the cluster API at the
moment doesn't pass this parameter when the child workers are spawned.
With a workaround I was able to get the option working, and I could
see the memory growing.

Marak, to your questions on the scenario:

We are doing a heavy batch load of data: there is a node process
which acts as a client and posts data into the server using the REST
API. We are using this batch upload as a way to test how the server
process behaves under heavy loads.

There are situations where memory usage could be high based on the
amount of data processed. We are also in the process of tuning the
volume based on how much a node process can handle.

But the fact that node can only handle a max of 1.9 GB seems kind of
too low from a server perspective.

I am trying to spread the load, and that is the reason I started to
use your hook.io API.

On Aug 9, 8:34 pm, Marak Squires <marak.squi...@gmail.com> wrote:
> To my knowledge, there is a hard limit in v8 which you cannot bypass. See:http://groups.google.com/group/nodejs/browse_thread/thread/5597be34a3...

Marak Squires

Aug 10, 2011, 12:25:35 AM
to nod...@googlegroups.com
Roger that!

 :-)

Ben Noordhuis

Aug 10, 2011, 7:48:28 AM
to nod...@googlegroups.com
On Wed, Aug 10, 2011 at 06:22, gitfy <lenin....@gmail.com> wrote:
> We are doing a heavy batch load of data: there is a node process
> which acts as a client and posts data into the server using the REST
> API. We are using this batch upload as a way to test how the server
> process behaves under heavy loads.
>
> There are situations where memory usage could be high based on the
> amount of data processed. We are also in the process of tuning the
> volume based on how much a node process can handle.
>
> But the fact that node can only handle a max of 1.9 GB seems kind of
> too low from a server perspective.

V8's heap size is limited to 1 GB but Buffer objects exist largely
outside the heap. With large uploads, the bulk of your data lives in
buffers. It should hardly be touching the heap.
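Ben's point is easy to verify: allocating a large Buffer barely moves the V8 heap counters, since the bytes live in external memory. A minimal sketch on a modern Node (Buffer.alloc is assumed; older releases used the Buffer constructor):

```javascript
// Compare V8 heap usage before and after a large Buffer allocation.
// Buffer backing stores are allocated outside the V8 heap, so heapUsed
// should barely move even for a 100 MB allocation.
const before = process.memoryUsage().heapUsed;

const big = Buffer.alloc(100 * 1024 * 1024); // 100 MB, lives off-heap
big[0] = 1; // touch it so the allocation is actually used

const after = process.memoryUsage().heapUsed;
const heapGrowthMB = (after - before) / 1024 / 1024;

console.log(`heap grew by only ${heapGrowthMB.toFixed(2)} MB for a 100 MB Buffer`);
```

The off-heap bytes show up in the rss and external fields of process.memoryUsage() instead, which is why large uploads streamed through buffers need not hit the old-space limit.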

gitfy

Aug 10, 2011, 10:03:25 AM
to nodejs
By upload I don't mean the upload stream data as such; I mean the
number of objects we might create during the upload process (like
database records etc.).
This could well go over 2 GB depending on the processing.

On Aug 10, 4:48 am, Ben Noordhuis <i...@bnoordhuis.nl> wrote:

Jann Horn

Aug 10, 2011, 10:26:22 AM
to nod...@googlegroups.com
2011/8/10 gitfy <lenin....@gmail.com>:

> By upload I don't mean the upload stream data as such; I mean the
> number of objects we might create during the upload process (like
> database records etc.).
> This could well go over 2 GB depending on the processing.

Would it help you to be able to process the upload as a stream of
objects? Could you just execute one function on each object and then
forget it? In that case, have a look at my halfstreamxml package
(for JSON, something similar should be possible too):
https://github.com/thejh/node-halfstreamxml
