Hi, we run NodeJS in production and I want to ask some performance
questions.
We use Ubuntu Server AMD64. We run multiple, small non-NodeJS
binaries which we have compiled as 32 bit executables. Our servers
have enough cores/processes that system RAM would be exhausted if each
process malloc'd its maximum 2GiB of address space - i.e. the
2GiB/process address space limit is not an issue.
Is there any advantage to using 32 bit NodeJS on a 64 bit
architecture, when running ten or more NodeJS processes behind pound?
Are there benchmarks?
I'm asking this because our other binaries use less memory and run
slightly faster as 32 bit executables.
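For what it's worth, the comparison I have in mind is just running the
same throwaway script (call it check.js - the name is only for
illustration) under each node build and eyeballing what it prints:

    // check.js - hypothetical helper: run it under both a 32 bit and a
    // 64 bit node build and compare the output (values are in bytes;
    // the exact field names may differ slightly between node versions).
    console.log(process.memoryUsage()); // e.g. { rss, vsize, heapTotal, heapUsed }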
http://www.tuxradar.com/content/ubuntu-904-32-bit-vs-64-bit-benchmarks
hints the opposite - that V8 performance may be worse as a 32 bit
binary.
Thanks,
Chris.
P.S. What would the impact on npm be? Would it be better to maintain
a 32 bit (VM) system for NodeJS/npm builds, and ship all binaries
(NodeJS and npm C++ packages) wholesale onto the 64 bit servers?
P.P.S. getlibs: http://ubuntuforums.org/showthread.php?t=474790
--
V8 3.0-3.1 is much slower on 64-bit than on 32-bit machines because the Crankshaft optimizer is disabled there. However, if you recompile node.js with V8 3.2.x (it's easy and works fine) you will get a huge speedup.
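A quick sanity check after the rebuild (just a one-liner sketch) is to
print the V8 version the running node binary actually bundles:

    // Prints the bundled V8 version, e.g. 3.1.x for stock node v0.4,
    // or 3.2.x after recompiling against the newer V8.
    console.log(process.versions.v8);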
--
My _personal_ opinion is that nobody should ever have more than 100 MB
of objects in the JS heap. Everything beyond that should go to the
native heap, which is bounded only by OS/hardware limitations.
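To illustrate what I mean by native heap, here is a minimal sketch
using node's Buffer type (the numbers in the comments are rough
expectations, not measurements): the bytes of a Buffer are allocated
outside the V8 heap, so only the small wrapper object counts against
the JS heap.

    // Sketch: ~100 MB of payload allocated outside the V8 heap; only
    // the Buffer wrapper itself is a JS object.
    var store = new Buffer(100 * 1024 * 1024);
    store[0] = 1;                       // byte-level access, like an array
    console.log(process.memoryUsage()); // rss grows ~100 MB, heapUsed barely moves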
--
Vyacheslav Egorov
Matt, the limit for the V8 heap is 1.7-1.9 GB on x64 (1 GB on ia32)
now. But your server will start suffering noticeable full-GC pauses
before hitting this limit.
My _personal_ opinion is that nobody should ever have more than 100 MB
of objects in the JS heap. Everything beyond that should go to the
native heap, which is bounded only by OS/hardware limitations.
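If you want to see those pauses without a profiler, one crude sketch
(assuming your workload keeps the event loop otherwise responsive) is
to watch a timer for drift; a full GC shows up as the interval firing
late:

    // Sketch: a 500 ms interval that logs when it fires noticeably late,
    // which is what a long stop-the-world GC pause looks like from JS.
    var last = Date.now();
    setInterval(function () {
      var now = Date.now();
      var late = now - last - 500;
      last = now;
      if (late > 100) console.log('event loop stalled ~' + late + ' ms');
    }, 500);

Passing --trace_gc on the command line (if your node build forwards V8
flags) is the more direct way to see what the collector is doing.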
It might not make sense in your application (everyone here thinks "Node === Web" and can't see outside of that box). There are a lot of servers out there that might need more.
I am not talking about 'Node === Web' (which is partly true :) ). However, I think you need to take the entire sentence: "Since Javascript is single-threaded it makes no sense having a large v8 heap size."

See this from the same article:

'His response, which this whole post was lead-in for, is that when you have that much memory it's difficult to actually make use of all of it from a language like JavaScript because it's single-threaded. That is, the time it takes to read in and out (or more importantly, process) that much data starts making apps that would use more data unuseful.'
On Thu, Apr 21, 2011 at 8:00 PM, Matt <hel...@gmail.com> wrote:
> It might not make sense in your application (everyone here thinks "Node === Web" and can't see outside of that box). There are a lot of servers out there that might need more.

That was why I was asking :) What situations are those?
On Thu, Apr 21, 2011 at 9:08 AM, Joe Developer <joe.d.d...@gmail.com> wrote:
> On Thu, Apr 21, 2011 at 8:00 PM, Matt <hel...@gmail.com> wrote:
>> It might not make sense in your application (everyone here thinks "Node === Web" and can't see outside of that box). There are a lot of servers out there that might need more.
>
> That was why I was asking :) What situations are those?

Honestly there's a gazillion...

How about a mail server serving 100k concurrent connections?
A web server with a two-level cache (local memory then memcached)?
Just about anything related to finance.
A large in-memory trie structure that needs to be accessed via a network request.
An IPv6 blocklist.

I can keep going, but these kinds of memory restrictions only make sense in a web browser IMHO.

Matt.
On Thu, Apr 21, 2011 at 9:03 PM, Matt <hel...@gmail.com> wrote:
> Honestly there's a gazillion...
>
> How about a mail server serving 100k concurrent connections?
> A web server with a two-level cache (local memory then memcached)?
> Just about anything related to finance.
> A large in-memory trie structure that needs to be accessed via a network request.
> An IPv6 blocklist.

I must admit that I don't find keeping the data in-process particularly appealing for any of those. I am not arguing that the limit should be retained, but in-process memory on that scale seems fairly odd to me.

-Louis
> My _personal_ opinion is that nobody should ever have more than 100 MB
> of objects in the JS heap. Everything beyond that should go to the
> native heap, which is bounded only by OS/hardware limitations.
>
> --
> Vyacheslav Egorov
100 MB might be a bit harsh... I just find the idea of large in-memory
storage engines built in pure JS a bit disturbing.
--
Vyacheslav Egorov
The x86 V8 is more mature than the x64 one. In Node v0.4 (which
corresponds to V8 3.1) the runtime optimization engine, Crankshaft, is
enabled by default on x86 but disabled by default on x64. (You can
enable Crankshaft on x64 by using the --crankshaft command line
argument.) You'll get better performance out of x86.
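If you want to measure it yourself, a trivial (and admittedly
synthetic) sketch is to time the same numeric hot loop on an x86
build, an x64 build, and an x64 build started as
node --crankshaft bench.js:

    // bench.js - hypothetical micro-benchmark; a tight numeric loop is
    // the kind of code Crankshaft optimizes, so the difference should
    // be visible in the elapsed time.
    var start = Date.now();
    var sum = 0;
    for (var i = 0; i < 1e8; i++) {
      sum += i * 2;
    }
    console.log('sum=' + sum + ' elapsed=' + (Date.now() - start) + ' ms');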
The sky-high abstractions that the JVM and V8 provide, like garbage
collection, are not appropriate for programs that need many gigabytes
of memory.
There are plenty of apps written for the JVM that use gigabytes of memory.