Memory leak in nodejs client streaming rpc?


Abhishek Parmar

Sep 11, 2015, 1:55:56 PM
to grp...@googlegroups.com, Michael Lumish
Hi,
We are running grpc-0.9 and have a Node.js application that makes an RPC to a streaming server that spits out a message (about 2KB in size) once every second, forever.

The RSS of the Node.js process grows over time without it doing anything else (we are trying to get a more isolated test case to be sure).

If we kill the streaming server (which uses gRPC C++), the RSS stops growing. Restarting the server makes the RSS of the Node.js process grow again.

Making the streaming server spit out messages faster makes the RSS grow faster.

Node.js memory-profiling tools like heapdump account for only a very small fraction of the RSS.
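
A minimal sketch of how snapshots like that can be taken with the heapdump module (illustrative only, not our exact code; the output path and interval are arbitrary):

// The heapdump npm module exposes writeSnapshot(), which dumps the V8 heap
// to a .heapsnapshot file that can be opened in Chrome DevTools. These
// snapshots cover only the JS heap, which is why they explain so little of
// the RSS if the leak is in native code.
var heapdump = require('heapdump');

setInterval(function () {
  heapdump.writeSnapshot('/tmp/node-' + Date.now() + '.heapsnapshot',
    function (err, filename) {
      if (err) console.error(err);
      else console.log('heap snapshot written to', filename);
    });
}, 60000);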

Is it possible there are memory leaks in the nodejs native grpc module? Do you guys have a stress test like this?

-Abhishek

Abhishek Parmar

Sep 11, 2015, 3:03:52 PM
to grp...@googlegroups.com, Michael Lumish, Jens Schmidt
+Jens, who is helping me track this down on our side.

It definitely looks gRPC-streaming related. See the attached snippet of code (happy to provide more details if needed).

We ran it like this:

$ node --expose-gc grpc-stream.js

82 messages received { rss: 25640960, heapTotal: 16571136, heapUsed: 5371080 }
171 messages received { rss: 26972160, heapTotal: 16571136, heapUsed: 5709536 }
256 messages received { rss: 26972160, heapTotal: 16571136, heapUsed: 9481616 }
341 messages received { rss: 27783168, heapTotal: 16571136, heapUsed: 9269336 }
425 messages received { rss: 28323840, heapTotal: 16571136, heapUsed: 8878176 }
510 messages received { rss: 28323840, heapTotal: 16571136, heapUsed: 8507640 }
592 messages received { rss: 29671424, heapTotal: 16571136, heapUsed: 7996336 }
665 messages received { rss: 29671424, heapTotal: 16571136, heapUsed: 7100616 }
747 messages received { rss: 30752768, heapTotal: 16571136, heapUsed: 6591592 }
828 messages received { rss: 30752768, heapTotal: 16571136, heapUsed: 6046728 }
916 messages received { rss: 32104448, heapTotal: 16571136, heapUsed: 5802144 }
1002 messages received { rss: 32641024, heapTotal: 16571136, heapUsed: 9590496 }
1087 messages received { rss: 33722368, heapTotal: 16571136, heapUsed: 9217016 }
1175 messages received { rss: 33722368, heapTotal: 16571136, heapUsed: 8970088 }
1256 messages received { rss: 34533376, heapTotal: 16571136, heapUsed: 8423320 }
1341 messages received { rss: 34803712, heapTotal: 16571136, heapUsed: 8045928 }
1424 messages received { rss: 35885056, heapTotal: 16571136, heapUsed: 7587208 }
1511 messages received { rss: 36155392, heapTotal: 16571136, heapUsed: 7299904 }
--
-Abhishek

grpc-stream.js
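
The attachment itself is not reproduced here; a minimal sketch of a test along these lines might look as follows (the proto path, package/service/method names, server address, and credentials call are assumptions, and the grpc 0.x Node API may differ slightly between releases):

// grpc-stream sketch: read a server-streaming RPC forever and periodically
// force a GC and print process.memoryUsage(), producing lines like the
// "N messages received { rss: ..., heapTotal: ..., heapUsed: ... }" output above.
var grpc = require('grpc');

// Assumed proto: package test; service Streamer { rpc Subscribe(Request) returns (stream Message); }
var proto = grpc.load(__dirname + '/stream_test.proto').test;
var client = new proto.Streamer('localhost:50051',
                                grpc.credentials.createInsecure());

var received = 0;
var call = client.subscribe({});  // a server-streaming call returns a readable stream

call.on('data', function (msg) {
  received++;
});
call.on('error', function (err) {
  console.error('stream error:', err);
});
call.on('end', function () {
  console.log('stream ended');
});

setInterval(function () {
  if (typeof global.gc === 'function') {
    global.gc();  // requires running node with --expose-gc
  }
  console.log(received + ' messages received', process.memoryUsage());
}, 5000);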

Michael Lumish

Sep 14, 2015, 1:45:26 PM
to Abhishek Parmar, grp...@googlegroups.com, Jens Schmidt
OK, I can definitely reproduce this with the latest version of gRPC. I'll investigate, and see if I can find what's leaking.

Abhishek Parmar

Sep 14, 2015, 2:03:27 PM
to Michael Lumish, grp...@googlegroups.com, Jens Schmidt
Thanks, let us know if you need any more information. I guess the easiest might be to use LD_PRELOAD to load tcmalloc and set HEAPPROFILE on the node binary to get some stack traces (I am assuming the leak is in the native module).
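
Roughly something like this (just a sketch; the tcmalloc library path, profile prefix, and pprof binary name vary by system):

$ LD_PRELOAD=/usr/lib/libtcmalloc.so HEAPPROFILE=/tmp/node-heap \
    node --expose-gc grpc-stream.js
# tcmalloc then writes /tmp/node-heap.0001.heap, .0002.heap, ... as the heap grows
$ pprof --text `which node` /tmp/node-heap.0001.heap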
--
-Abhishek

Michael Lumish

Sep 14, 2015, 6:05:18 PM
to Abhishek Parmar, grp...@googlegroups.com, Jens Schmidt
I didn't actually know about HEAPPROFILE; you're right, that helped a lot. I was able to track down a couple of problems that appear to account for the majority of the memory usage you were seeing. This pull request has the details: https://github.com/grpc/grpc/pull/3339

Abhishek Parmar

Sep 14, 2015, 6:24:49 PM
to Michael Lumish, grp...@googlegroups.com, Jens Schmidt
Awesome. Do you think this can be cherry-picked into the 0.11 release soon? Will it be possible to cherry-pick it into the 0.9 branch as well? (I was hoping to upgrade to 0.11 before our upcoming release, but got delayed by a few bugs and now there is not enough time.)


--
-Abhishek

Michael Lumish

Sep 14, 2015, 6:56:48 PM
to Abhishek Parmar, grp...@googlegroups.com, Jens Schmidt
Once it gets merged, I'll look into backporting it to older versions.