Limit of 32768 concurrent connections in a nodejs server


Guido García

Jul 3, 2013, 6:37:10 PM
to nod...@googlegroups.com

I am facing an issue when trying to keep more than 32768 concurrent HTTP connections in a nodejs server running on a SmartOS machine. Once I reach that number of concurrent connections, the server starts rejecting new HTTP requests.

I have already increased the max number of file descriptors to 999999 with ulimit:

[root ~]# ulimit -a 
core file size (blocks, -c) unlimited 
data seg size (kbytes, -d) unlimited 
file size (blocks, -f) unlimited 
open files (-n) 999999 
pipe size (512 bytes, -p) 10 
stack size (kbytes, -s) 10240 
cpu time (seconds, -t) unlimited 
max user processes (-u) 99994 
virtual memory (kbytes, -v) unlimited


This is my nodejs code:

var http = require("http");

var PORT = 4000; 
var KEEPALIVE_SECS = 30;

http.createServer(function(request, response) { 
    response.writeHead(200, 'Ok'); 
    setInterval(function() { 
      response.write('keepalive'); 
    }, KEEPALIVE_SECS * 1000); 
}).listen(PORT);
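
A minimal variant of the same server that also logs the live connection count every few seconds can show exactly where the number stops growing (server.getConnections is in node 0.10; the cleanup of the per-request timer on 'close' is just to avoid leaking timers during a long test):

var http = require("http");

var PORT = 4000;
var KEEPALIVE_SECS = 30;

var server = http.createServer(function(request, response) {
  response.writeHead(200, 'Ok');
  // Write something periodically so the connection stays open.
  var timer = setInterval(function() {
    response.write('keepalive');
  }, KEEPALIVE_SECS * 1000);
  // Stop writing once the client goes away.
  response.on('close', function() {
    clearInterval(timer);
  });
});

server.listen(PORT);

// Log how many sockets the server currently holds, to see where it tops out.
setInterval(function() {
  server.getConnections(function(err, count) {
    if (!err) console.log('open connections:', count);
  });
}, 5000);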


If I run two processes (listening on different ports) I can handle 32K x 2 = 64K concurrent connections, so I think the limitation has something to do with the max number of file descriptors per process. Is there any limitation in nodejs I should be aware of?

I am using node 0.10.4. The virtual machine is hosted at Joyent and has plenty of resources (16 GB RAM and 12 vCPUs).

Thank you in advance,
Guido.

mscdex

Jul 3, 2013, 10:20:49 PM
to nod...@googlegroups.com
On Wednesday, July 3, 2013 6:37:10 PM UTC-4, Guido García wrote:

I am facing an issue when trying to keep more than 32768 concurrent HTTP connections in a nodejs server running on a SmartOS machine. Once I reach that number of concurrent connections, the server starts rejecting new HTTP requests.


Can you verify the process has the limits you set? Do: `cat /proc/<pid>/limits`
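
On a system that exposes /proc/<pid>/limits (Linux does), the same check can also be done from inside the node process itself; a minimal sketch:

var fs = require('fs');

// Read this process's resource limits from procfs (Linux only).
// The "Max open files" line is the one that matters here.
console.log(fs.readFileSync('/proc/' + process.pid + '/limits', 'utf8'));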

Michal Kruk

Jul 4, 2013, 3:42:43 AM
to nod...@googlegroups.com
I suppose he is testing on localhost, in which case these numbers are normal and there is nothing you can do about it. If you want to run a real test, get some cloud machines to generate the requests for you; that way you can get to 1M and more connections in a single process :)

Regards, Michał Kruk



Guido García

Jul 4, 2013, 3:10:58 PM
to nod...@googlegroups.com
I'm already on the cloud; I am using Joyent Smart Machines. It seems there is a hard limit (32768) on the number of concurrent connections per process somewhere. It could even be a limitation imposed by the SmartDataCenter... any idea?

I am facing the same limit when I reproduce the same scenario on a Linux Ubuntu 12.04 machine.

It is a pity, because I was thinking about breaking the Guinness World Record.

Thank you,
Guido.
PS: There is no /proc/<pid>/limits in SmartOS.

mscdex

Jul 4, 2013, 5:05:37 PM
to nod...@googlegroups.com
On Thursday, July 4, 2013 3:10:58 PM UTC-4, Guido García wrote:
PS: There is no /proc/<pid>/limits in SmartOS.


"Solaris" has to be so difficult ;-)

It looks like the equivalent is the prctl command. You might give that a try instead?
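
If prctl turns out to be the right tool, something along these lines could be run from inside the process (the process.max-file-descriptor resource control name is taken from the Solaris docs; treat the exact invocation as a sketch):

var exec = require('child_process').exec;

// Ask SmartOS/Solaris for this process's file descriptor limit.
exec('prctl -n process.max-file-descriptor -i process ' + process.pid,
  function(err, stdout, stderr) {
    if (err) {
      console.error('prctl failed:', stderr || err.message);
      return;
    }
    console.log(stdout);
  });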

Fedor Indutny

Jul 4, 2013, 5:24:07 PM
to nod...@googlegroups.com
Are you sure you're not running out of ephemeral ports?

Cheers,
Fedor.



Matt

Jul 4, 2013, 5:54:42 PM
to nod...@googlegroups.com

On Thu, Jul 4, 2013 at 5:24 PM, Fedor Indutny <fe...@indutny.com> wrote:
Are you sure you're not running out of ephemeral ports?

A server shouldn't ever run out of ephemeral ports (OK, in theory there's a maximum, somewhere around 140 trillion concurrent connections, but nobody is ever going to reach that). You might run out on a client though, and I suspect that's what's happening here: testing from a single machine.

Matt.

Guido García Bernardo

Jul 4, 2013, 6:29:50 PM
to nod...@googlegroups.com
I increased the number of ephemeral ports on the client before running it. In any case I don't think that is the issue, because when I run a second copy of the same nodejs server process (on a different port), the two processes are able to handle 32K concurrent connections each (64K total).

Thanks for your help.



Matt

Jul 4, 2013, 6:38:09 PM
to nod...@googlegroups.com

On Thu, Jul 4, 2013 at 6:29 PM, Guido García Bernardo <guido.garc...@gmail.com> wrote:
I increased the number of ephemeral ports on the client before running it. In any case I don't think that is the issue, because when I run a second copy of the same nodejs server process (on a different port), the two processes are able to handle 32K concurrent connections each (64K total).

Can you be more clear here on what the problem is and what error you see?

On a CLIENT you cannot make more than 32k connections to the same IP/Port combination. It's impossible.

On a SERVER you can have trillions of incoming connections, BUT they MUST come from different IP addresses, up to a maximum of 32k per IP address.

Think of the table like this:

1) Local IP
2) Local port
3) Remote IP
4) Remote port

1 is usually fixed unless you listen on more than one IP address.
2 is usually fixed (e.g. port 80 for http)
3 varies in IPv4 up to 4 billion-ish addresses
4 varies and is generally randomly chosen by the OS, but is in a fixed range up to 32k

If you fix 1 2 and 3 then you can only have 32k connections. But if you vary 3 then your number is limitless (practically).
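
For example, a test client that spreads its outgoing sockets across several of its own IP addresses varies number 3 and gets past the per-address cap. A rough sketch (the addresses are made up, and it assumes a node version whose net.connect supports the localAddress option):

var net = require('net');

// Hypothetical addresses already configured on the client machine.
var LOCAL_ADDRESSES = ['10.0.0.10', '10.0.0.11', '10.0.0.12'];
var SERVER_HOST = '10.0.0.1'; // made-up server address
var SERVER_PORT = 4000;
var TOTAL = 100000;           // in a real test you would ramp up gradually

var open = 0;
for (var i = 0; i < TOTAL; i++) {
  // Rotate the source IP so each (local IP, remote IP, remote port)
  // combination gets its own pool of ephemeral ports.
  var socket = net.connect({
    host: SERVER_HOST,
    port: SERVER_PORT,
    localAddress: LOCAL_ADDRESSES[i % LOCAL_ADDRESSES.length]
  });
  socket.on('connect', function() {
    open++;
  });
  socket.on('error', function(err) {
    console.error('connect error:', err.message);
  });
}

setInterval(function() {
  console.log('connected sockets:', open);
}, 5000);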

Does that help?

Matt.

Jorge

Jul 5, 2013, 5:10:15 AM
to nod...@googlegroups.com
On 05/07/2013, at 00:38, Matt wrote:

> 4 varies and is generally randomly chosen by the OS, but is in a fixed range up to 32k

64k minus 1024: the "well known" TCP ports are never used as ephemeral ports.
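
The actual range depends on the OS and is tunable. On Linux it can be read from /proc; a quick check, assuming a Linux /proc filesystem:

var fs = require('fs');

// Linux prints "<first port> <last port>", e.g. "32768 61000" on older defaults.
var range = fs.readFileSync('/proc/sys/net/ipv4/ip_local_port_range', 'utf8').trim();
var parts = range.split(/\s+/);
var count = parseInt(parts[1], 10) - parseInt(parts[0], 10) + 1;
console.log('ephemeral port range:', parts[0], '-', parts[1], '(' + count + ' ports)');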

Guido García

Jul 7, 2013, 3:35:11 PM
to nod...@googlegroups.com
Guys, you were right. The limit is on the client.

I didn't realize the limit is 32k connections to the same IP/port combination, and not just to the same IP address (thanks Matt). That is why one client was able to open 64k connections to two different ports, which is what was confusing me.
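
In other words, from a single client address the count scales with the number of distinct server IP/port targets. A sketch of that workaround (host and ports are made up):

var net = require('net');

var SERVER_HOST = '10.0.0.1';          // made-up server address
var SERVER_PORTS = [4000, 4001];       // one node server process per port
var PER_PORT = 30000;                  // stay under the ~32k per-target cap

var open = 0;
SERVER_PORTS.forEach(function(port) {
  for (var i = 0; i < PER_PORT; i++) {
    var socket = net.connect({ host: SERVER_HOST, port: port });
    socket.on('connect', function() { open++; });
    socket.on('error', function(err) {
      console.error('port ' + port + ':', err.message);
    });
  }
});

setInterval(function() {
  console.log('connected sockets:', open);
}, 5000);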

Thank you,
Guido.