WebSocket Text vs Binary - Insane Performance Increases


Nick Newman

Apr 22, 2016, 6:48:27 AM
to nw.js
Hey guys,

I'm not posting this on GitHub because it's not really an issue, but I did notice something when testing WebSocket data. This was all done on the normal v0.14.2 build (no DevTools open).

But basically, if you start sending a lot of text frames, for example, something like this:

var createo = {};

for (var i = 0; i < 515; i++) {
    createo[i] = {
        "a": 31,
        "b": i,
        "c": "SWRC9",
        "d": "12312213",
        "e": [1, 2, 3, 4, 5],
        "f": {},
        "zz": 1461228009383,
        "g": 0,
        "h": 9300
    };
}

var string = JSON.stringify(createo);
socket.send(string);

If you set this up as an interval (for example, every 100 ms), NW.js's memory usage climbs drastically. My client went up to around 200,000 K in just under 3 minutes (and the garbage collector hadn't kicked in yet). I believe this is because WebSockets do something with UTF-8: the text frame gets converted to a UTF-8 buffer on every send, and those allocations fill up the heap faster than the GC reclaims them.
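For concreteness, the interval setup I mean looks roughly like this. This is just a sketch: `socket` is assumed to be an already-open WebSocket, and `buildPayload` wraps the object-building loop from above:

```javascript
// Sketch of the text-frame interval described above. Assumptions: `socket`
// is an already-open WebSocket; buildPayload wraps the object-building loop.
function buildPayload() {
    var createo = {};
    for (var i = 0; i < 515; i++) {
        createo[i] = { "a": 31, "b": i, "c": "SWRC9", "e": [1, 2, 3, 4, 5], "zz": Date.now() };
    }
    return JSON.stringify(createo); // a fresh multi-kilobyte string every call
}

// Every 100 ms a new string (plus whatever UTF-8 copy the WS stack makes of
// it) is allocated, and those allocations pile up until the GC runs:
// setInterval(function () { socket.send(buildPayload()); }, 100);
console.log(buildPayload().length);
```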

But, if you use the npm module: https://github.com/kawanet/msgpack-lite

And use a similar code block:

var createo = {};

for (var i = 0; i < 515; i++) {
    createo[i] = {
        "a": 31,
        "b": i,
        "c": "SWRC9",
        "d": "12312213",
        "e": [1, 2, 3, 4, 5],
        "f": {},
        "zz": 1461228009383,
        "g": 0,
        "h": 9300
    };
}

var buffer = msgpack.encode(createo);
socket.send(buffer);


NW.js actually stays at roughly 75,000 K of memory, and continues to idle around there even when sending the data every 100 ms.

I have tried BSON, PSON, msgpack, msgpack5, UBJSON, etc. None of them are as fast as msgpack-lite.

In any event, my point is: use binary WebSockets!!

And, thanks to Roger for such a great program, keep it up!

VoidVolker

Jun 28, 2016, 2:43:59 AM
to nw.js
Did you test the encoding/decoding speed of binary msgpack-lite vs. text JSON? I tested BSON earlier and it was very slow; plain JSON.parse / JSON.stringify was much faster.

> This is because I believe WebSockets do something with UTF-8

The standard doesn't say anything about encoding; you can transfer any text or binary data via WS. The memory growth may be produced by some code around the WS rather than the WS itself, not sure. I've been using WS for a long time and will test this case.
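A serialization-only probe for that test could look like the sketch below. It involves no WebSocket at all, so it only isolates the cost of producing the text frames; `heapAfter` is a hypothetical helper, and the `global.gc` call only takes effect when Node is started with `--expose-gc`:

```javascript
// Sketch of a serialization-only memory probe (no WebSocket involved), to
// check whether heap growth comes from producing the frames themselves.
function heapAfter(iterations, serialize) {
    // Collect first if possible, so the measurement starts from a clean heap.
    // global.gc is only defined when Node runs with --expose-gc.
    if (global.gc) global.gc();
    for (let i = 0; i < iterations; i++) {
        serialize({ a: 31, b: i, e: [1, 2, 3, 4, 5], zz: Date.now() });
    }
    return process.memoryUsage().heapUsed;
}

const used = heapAfter(10000, JSON.stringify);
console.log((used / 1024).toFixed(0) + ' K heap used after 10000 stringify calls');
```

Running the same harness with `msgpack.encode` as the `serialize` argument would give the binary-side number to compare against.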