HTTP Client receives "socket hang up" error instead of response callback


Bruno Jouhier

May 27, 2011, 6:38:06 PM
to nodejs
I hit a strange bug today: I have a proxy that uses an HTTP client to
forward requests. In some situations, the client does not receive the
response callback but gets a "socket hang up" error instead.

I managed to reproduce it reliably with the following setup:

In my process, I have an HTTP server:

- When the server receives a request on /foo, it sends N parallel requests to /bar, concatenates the N responses, and returns the result.
- When it receives a request on /bar, it proxies it to /zoo.
- When it receives a request on /zoo, it returns a text response.

If N < 5, everything works fine.
If N >= 5, the second-layer HTTP clients (calling /zoo) get a "socket
hang up" after approximately 2 minutes instead of an immediate response.
Then the first-layer HTTP clients (calling /bar) receive a response.

If I short-circuit the proxying from /bar to /zoo by having /foo
forward directly to /zoo, I don't have the problem: I can set N to 100
without any issue.

I have created two gists:
https://gist.github.com/996245 source code of my test program.
https://gist.github.com/996238 transformed version of the test
program.

The test program sends PUT requests, but I changed it to use GET
requests instead and it did not make any difference. So the problem is
not due to the fact that the request contains a body.

I was running 0.4.7 on Mac OS X this morning. I upgraded to 0.4.8 but
it did not make a difference.

Strange!

Bruno

Bruno Jouhier

May 30, 2011, 5:18:51 PM
to nodejs
I'm reposting because I think that there is a serious bug here.

I've created a slightly simpler version that demonstrates the problem
with a more natural scenario.
The new scenario is the following:

- N HTTP clients send requests to the /proxy?url=/i URL on the server, in parallel.
- The server forwards these requests to the /i URL with an HTTP client.

If N < 5, everything works fine.
If N >= 5, the proxy gets a "socket hang up" when forwarding the
request.

The following code creates only 4 clients. It works fine and produces
the following output:
result=/1/2/3/4

But if you add one element to the [1,2,3,4] array, it fails after 2
minutes with:
socket hang up

You can run it with "node-streamline bug_" (you have to install
streamline with the -g option).
Node version is 0.4.8 on Mac OS X.

Bruno

--------- begin bug_.js ----------------
var streams = require("streamline/lib/streams/server/streams");
var flows = require("streamline/lib/util/flows");
var port = 3001;

function send(_, path) {
  return streams.httpRequest("http://localhost:" + port + path).end().response(_).readAll(_);
}

function reply(response, statusCode, str) {
  response.writeHead(statusCode, {
    "content-type": "text/plain"
  }).end(str);
}

var server = new streams.HttpServer(function(request, response, _) {
  if (request.url.indexOf("/proxy?url=") == 0)
    reply(response, 200, send(_, request.url.substring(11)));
  else
    reply(response, 200, request.url);
}).listen(_, port);

try {
  // Use futures to send the requests in parallel.
  // Add one element to the array to see the problem.
  var futures = [1, 2, 3, 4].map(function(i) {
    return send(null, "/proxy?url=/" + i);
  });
  // Collect all the results.
  var result = flows.collect(_, futures).join("");
  console.log("result=" + result);
} catch (ex) {
  console.log(ex.message + "\n" + ex.stack);
}

server.close(_);
--------- end bug_.js ----------------

billywhizz

May 30, 2011, 10:16:05 PM
to nodejs
Bruno, I think it would be a lot easier if you could post standard
node.js code that shows the "bug", as there are too many things to
consider when your streamline.js code is used for the example.
Personally, it gives me a massive headache looking at all those
underscores...

Ted Young

May 31, 2011, 1:04:42 AM
to nod...@googlegroups.com
Yeah, sorry, there's a lot of code in there; it's a little difficult to
parse. agent.maxSockets defaults to 5 though, so 6 requests is a
pretty suspicious number to get a socket timeout. Maybe you're getting
caught by proxying to the same domain in the same process, and the
proxy can't get a socket because the initial requests used them all?

Ted


Bruno Jouhier

May 31, 2011, 3:52:23 AM
to nodejs
Hi Ted,

Thanks for the pointer. Yes, it is due to agent.maxSockets. If I bump
it up, everything works.

And yes, the program is proxying to the same domain in the same
process. Why should that matter?

What I don't understand is why it hangs. agent.maxSockets should limit
the number of sockets open at one time. If you request more than the
max, the extra requests should be queued and handled a bit later, as
the responses start coming back and sockets are returned to the pool.
They should not block the whole thing. This looks abnormal.

I've converted the code to "classical" callback style, for those who
would find a hairy send function with 4 callbacks easier to "parse"
than the following one-liner:

function send(_, url) { return streams.httpRequest(url).end().response(_).readAll(_); }

Bruno

--------- begin bug.js ----------------
var http = require("http");
var port = 3001;

var count = 5;
//http.Agent.defaultMaxSockets = 6;

function send(path, callback) {
  var result = "";
  http.request({
    host: "localhost",
    port: port,
    path: path,
    method: 'GET'
  }, function(res) {
    res.setEncoding('utf8');
    res.on('data', function(chunk) {
      result += chunk;
    });
    res.on('end', function() {
      callback(null, result);
    });
  }).on('error', function(err) {
    callback(err);
  }).end();
}

function reply(response, statusCode, str) {
  response.writeHead(statusCode, {
    "content-type": "text/plain"
  });
  response.end(str);
}

var server = http.createServer(function(request, response) {
  if (request.url.indexOf("/proxy?url=") == 0) {
    send(request.url.substring(11), function(err, result) {
      if (err)
        reply(response, 500, err.message);
      else
        reply(response, 200, result);
    });
  } else {
    reply(response, 200, request.url);
  }
});

server.listen(port, function() {
  console.log("server ready, sending messages");
  var received = 0;
  for (var i = 0; i < count; i++) {
    send("/proxy?url=/" + i, function(err, result) {
      if (err)
        console.log("ERROR: " + err.message);
      else
        console.log("OK: " + result);
      if (++received == count)
        server.close();
    });
  }
});
--------- end bug.js ----------------

Ted Young

May 31, 2011, 12:34:49 PM
to nod...@googlegroups.com
Yeah, pretty sure maxSockets is what you are hitting. When you start,
you create five requests to localhost. Those requests can't end until
their proxies also complete, and the proxies are all queued up on
localhost waiting for one of the initial requests to end so they can
get a socket. So... deadlock! If the proxy were in a separate
process, it would have a separate queue and you wouldn't see this
behavior.

Boy howdy, maxSockets is a useful convenience that can really screw
with you if you are not expecting it! This example is kind of a
contrived edge case, but I can see this spilling over into the real
world as we start to write more complicated software.

BTW I wasn't trying to knock streamline, it looks very concise but
could also be a source of bugs, so seeing a test without it was helpful.

Ted

Mikeal Rogers

unread,
May 31, 2011, 1:17:04 PM5/31/11
to nod...@googlegroups.com
A good way to test if it's a pooling issue is to do http.request({agent: false}); that will force a new Agent instance for every http call.

pigmej

May 31, 2011, 11:42:34 AM
to nodejs
Hello,

I just hit exactly this problem.

So, to clarify it in a 'short' message:

When you try to make more http requests than
http.Agent.defaultMaxSockets, the requests get 'killed' instead of
being queued in memory.

For me the expected behavior of 'keep-alive' support would be: "keep 10
sockets active and queue the 20 requests to use them", not "keep 10
sockets, serve 10 requests, and kill the rest (10)".

Bruno Jouhier

May 31, 2011, 6:12:36 PM
to nodejs
Hi Ted,

Thanks a lot for the explanation. I think I get the picture.

I'll short-circuit the HTTP layer for local requests anyway, but this
is tricky and I bet that some people will get caught by this one.

Bruno.

Bruno Jouhier

May 31, 2011, 6:37:30 PM
to nodejs
Interesting. I assume that there is overhead in setting agent to
false. It would be nice to have "agent managers" so that different
components could pick their agents from different managers. This could
avoid some deadlocks (but not all).

Bruno