Socket Memory Leak (sad story warning)


Hugo

Dec 3, 2011, 6:24:54 PM
to nod...@googlegroups.com
Soooo... my node server does what node does best, i.e. taking http(s) requests on the front-end, collecting data from various back-ends (including mongo, web APIs, etc.) before packaging the data for the client and returning a response.

Turns out the server is leaking memory, which isn't cool but hey, it's life. The always-helpful node-inspector tells me I have 140 Socket and 355 IOWatcher objects in the heap. At any point in time I would expect to see a few of these (maybe in the 10-20 range, to account for 3 or 4 client connections and 3x as many back-end connections), but 140 & 355 seem suspiciously high, so I suspect some of my sockets are stuck forever in the heap.

To try and narrow it down to a leak related to front-end connections I ran netstat, which tells me that my server has 124 sockets in the ESTABLISHED state on my main front-end port (8443, 'coz they are https connections). That correlates well with the 140 Socket objects stuck in the heap, so I am pretty sure now that some connections just never get released (many clients are mobile phones, so unplanned disconnects are relatively frequent, which may be a factor).
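(For comparison, the same count can be watched from inside node itself. A rough sketch, assuming the https server instance is called `server`; net.Server keeps a running count of its open connections:)

// Sketch: log the server's own connection count periodically so it can be
// compared against what netstat reports for port 8443.
setInterval(function () {
  console.log('open connections: ' + server.connections);
}, 60 * 1000);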

I tried a few things from there, none of them successful so far (that's the sad part):
* Implemented setTimeout(callback) on my incoming sockets to detect sockets that have been idle for a while... surprisingly, my callback was never called (rough sketch of the idea below)
* I listen to the 'close' event to clean things up after an unplanned disconnect (destroy()'ing the socket when this happens)
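For reference, here is the shape of that call on a net.Socket (a sketch only; `socket` stands for the incoming connection, e.g. req.connection). Note that setTimeout on a socket expects the idle time in milliseconds as its first argument, with the callback just acting as a one-shot 'timeout' listener:

// Sketch: flag the socket as idle after 60 seconds of no traffic.
socket.setTimeout(60 * 1000, function () {
  console.log('socket has been idle for 60s');
});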


Kinda stuck at this stage... if you feel you have the power to turn a sad story into a happy one... please, please share suggestions and ideas!

Jann Horn

Dec 3, 2011, 6:27:02 PM
to nod...@googlegroups.com
2011/12/4 Hugo <hha...@gmail.com>:

> Soooo... my node server does what node does best, i.e. taking http(s)
> requests on the front-end, collecting data from various back-ends (including
> mongo, web APIs, etc.) before packaging the data for the client and returning
> a response.

"I have a phone and I somehow don't manage to turn it on."

Could you please show us what you're actually doing, that is, your code?

Marak Squires

Dec 3, 2011, 6:29:40 PM
to nod...@googlegroups.com
Dumb question, are you listening to all events on socket?

error, drain, timeout, close
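e.g. something like this, just to see what actually fires (a sketch; `socket` being the connection):

// Sketch: log every lifecycle event on the socket.
['error', 'drain', 'timeout', 'close'].forEach(function (event) {
  socket.on(event, function () {
    console.log('socket event: ' + event);
  });
});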

--
Marak Squires
Co-founder and Chief Evangelist
Nodejitsu, Inc.

Diogo Resende

Dec 3, 2011, 7:22:50 PM
to nod...@googlegroups.com
On Sat, 3 Dec 2011 15:24:54 -0800 (PST), Hugo wrote:
> in the heap so I am pretty sure now that some connections just never
> get released (many clients are mobile phone, so unplanned disconnects
> are relatively frequent, which may be a factor).

I have a server that gets connections only from mobile devices (not phones, but they use 3G phone networks), and I get the same behavior. The devices sometimes disconnect, but the proxy on the 3G network never relays that to my side, so I don't have any way of knowing it.

I did this test:

- Connect a device to my server using 3G;
- Disconnect the device (remove the battery).

I never received an event. I had also set a setTimeout callback to trigger (after the device was off), send data to the socket and check if any errors occurred. No errors.
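Roughly what the test looked like (a sketch with made-up timings; `socket` being the server-side connection for that device):

// Sketch: after the device has been powered off, write to the (possibly
// half-open) connection and watch for an error.
socket.on('error', function (err) {
  console.log('socket error: ' + err.message);
});

setTimeout(function () {
  // The local TCP stack accepts the write even though the peer is gone,
  // so no error shows up right away.
  socket.write('ping');
}, 30 * 1000);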

I don't think this is a memory leak. I don't get this behaviour from anything other than devices on the 3G network. Really weird; I just schedule a service restart every week.

---
Diogo R.

Hugo

Dec 3, 2011, 8:34:53 PM
to nod...@googlegroups.com
To @Jann's point, let me try to provide additional context (the actual code would be wayyy too long to read): the server hosts a plain http API application; http(s) requests come in, json responses go out, no streamin' or anything creative with protocols.

Basically the app is much like the canonical Hello World app; the gist below is a good way to describe what goes on at a very high level:

var http = require('http');

http.createServer(function (req, res) {

  // Stand-in for the slow back-end work: respond after 10 seconds.
  var timeout = setTimeout(function () {
    res.writeHead(200, {'Content-Type': 'application/javascript'});
    res.end('{"Hello": "world"}');
  }, 10000);

  // Clean up if the client goes away before we respond.
  req.connection.on('close', function () {
    req.connection.destroy();
    clearTimeout(timeout);
  });

}).listen(1337, "127.0.0.1");

@Marak... dumb questions work, I'm a big fan :) I am listening to 'close' because it fires when the client goes away abruptly in my tests, but I am not currently listening to 'error' or 'timeout', and they are definitely good candidates (although I'm not sure how I should handle either event... suggestions welcome). At the very least I'll start listening to these events in production, if only to see if they get fired & correlate w/ memory usage.

billywhizz

Dec 4, 2011, 1:26:54 AM
to nodejs
Hugo,

you should listen for timeout and close the connection if you get it:
http://nodejs.org/docs/v0.6.4/api/net.html#event_timeout_

and error will tell you about an error, after which you *should*
receive a close:
http://nodejs.org/docs/v0.6.4/api/net.html#event_error_
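something like this, roughly (a sketch; `socket` being req.connection):

// Sketch: treat the connection as dead after 60 seconds idle.
socket.setTimeout(60 * 1000);
socket.on('timeout', function () {
  socket.end(); // or socket.destroy() to drop it immediately
});

// an 'error' is followed by a 'close', so per-connection cleanup can live there
socket.on('error', function (err) {
  console.log('socket error: ' + err.message);
});
socket.on('close', function (hadError) {
  // release any per-connection state here
});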

Shripad K

Dec 4, 2011, 1:40:22 AM
to nod...@googlegroups.com
This seems so similar to what I faced recently. Are you using some proxy in front of node? If yes, then which one?


Hugo

Dec 5, 2011, 3:53:29 AM
to nod...@googlegroups.com
@billywhizz Sounds like a sane idea, I'll try that next.

@shripad Yes indeed, the issue seems really similar to the one you fixed on bouncy... and... err... I sent an email to the author of the bouncy pull request, not realizing it was you.
Long story short: I am not using a proxy, I am not using streams, but I am seeing a high and growing number of ports in the ESTABLISHED state. I'd love to understand your rationale for cleaning up the stream on the connection's 'close' event and how that relates to the port state, as I suspect that a variation of this may apply to me.

Thanks!

Hugo

Dec 8, 2011, 2:49:52 PM
to nod...@googlegroups.com
I ended up cleaning up portions of my code; in particular I had some inconsistencies where socket events weren't managed properly (I seem to have forgotten that sockets are shared across many requests in some areas of the code, so I was either over-unsubscribing or over-subscribing events... the unfortunate result of late-night coding, maybe!).

Things seem to have stabilized now; carefully managing socket event subscriptions and calling req.connection.end() on the 'timeout' event seems to be the key.
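In case it helps anyone else, the fix boils down to something like this (a rough sketch, not my actual code; shown with plain http for brevity, the flag name is made up, and with keep-alive the same connection object serves many requests):

var http = require('http');

http.createServer(function (req, res) {
  var socket = req.connection;

  // Only wire up the socket once, even though it may carry many requests.
  if (!socket._cleanupInstalled) {
    socket._cleanupInstalled = true;

    socket.setTimeout(2 * 60 * 1000); // consider the connection dead after 2 idle minutes
    socket.on('timeout', function () {
      socket.end();
    });
    socket.on('error', function () {
      // 'close' follows 'error'; nothing extra needed here
    });
  }

  // ... normal request handling goes here ...
}).listen(8443);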

Thank you all!