
Native JSON on Rhino, TraceMonkey and V8


Joran
Aug 8, 2009, 4:37:09 AM

Hi Raphael,
Since the announcement of the new native JSON object for Rhino, I've
done some naïve testing on Rhino, TraceMonkey and V8, using two simple
functions:

var testparse = function(string) {
  var start = new Date().getTime();
  JSON.parse(string);
  return new Date().getTime() - start;
};

var teststringify = function(object) {
  var start = new Date().getTime();
  JSON.stringify(object);
  return new Date().getTime() - start;
};

For "object", I used an array of 2082 objects. For "string", I used
the JSON representation of "object". The length of "string" is about
650 KB. I ran the functions in batches of about 20 on each platform,
a couple of batches over the last week, on a MacBook Pro, and found
the following consistent averages:

JSON.parse:
Rhino: 640ms
TraceMonkey: 44ms
V8: 27ms

JSON.stringify:
Rhino: 140ms
V8: 70ms
TraceMonkey: 44ms

Is my method okay? Are these results to be expected?
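For reference, the setup described above might be reproduced with a sketch like the following. The exact shape of the 2082 objects isn't given in the thread, so the object literal here is an assumption; only the array size, string size ballpark, and 20-run averaging come from the post.

```javascript
// Hypothetical fixture: 2082 objects (the real object shape is unknown).
var fixture = [];
for (var i = 0; i < 2082; i++) {
  fixture.push({ id: i, name: "item-" + i, active: (i % 2) === 0, score: i * 0.5 });
}
var jsonString = JSON.stringify(fixture);

var testparse = function (s) {
  var start = new Date().getTime();
  JSON.parse(s);
  return new Date().getTime() - start;
};

// Average over 20 runs, as in the method described above.
var total = 0;
var runs = 20;
for (var r = 0; r < runs; r++) total += testparse(jsonString);
var avgParse = total / runs;
```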

Raphael
Aug 9, 2009, 7:06:35 AM

Hi Joran,

The benchmarking method looks okay to me, and those results are about
what I'd expect at the moment. For one thing, Rhino is simply a slower
engine in general than either of the other two. However, I do think
there's room for improvement in the performance, especially in the
parsing code. It would probably be best to profile it to see what's
actually taking the time, but I suspect the fact that I'm using
regular expressions to tokenise the input might be part of it. That
could be replaced with custom lexing code, though at the cost of a
little more complexity.
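For illustration only, the custom-lexing alternative mentioned here might look roughly like the sketch below (in JavaScript rather than the Java of Rhino's internals): scan characters directly instead of repeatedly matching token regexps. It's a minimal sketch that skips string escapes and the true/false/null keywords, and is not the code from any actual patch.

```javascript
// Minimal hand-rolled JSON tokenizer sketch (assumed, not from Rhino).
// Handles punctuation, unescaped strings, and numbers only.
function tokenize(src) {
  var tokens = [], i = 0, n = src.length;
  while (i < n) {
    var c = src.charAt(i);
    if (c === ' ' || c === '\t' || c === '\n' || c === '\r') { i++; continue; }
    if ('{}[]:,'.indexOf(c) >= 0) { tokens.push(c); i++; continue; }
    if (c === '"') {                       // string literal, no escape handling
      var j = ++i;
      while (i < n && src.charAt(i) !== '"') i++;
      tokens.push({ str: src.substring(j, i) });
      i++;
      continue;
    }
    if (c === '-' || (c >= '0' && c <= '9')) {  // number literal
      var k = i++;
      while (i < n && '0123456789.eE+-'.indexOf(src.charAt(i)) >= 0) i++;
      tokens.push({ num: parseFloat(src.substring(k, i)) });
      continue;
    }
    throw new SyntaxError("unexpected character: " + c);
  }
  return tokens;
}
```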

In any case, not a lot of thought has been put into the performance of
the JSON methods so far. I've mostly just wanted to make sure they're
correct, as my main focus is bringing Rhino closer to ECMAScript 5
compliance. Is it too slow for your purposes? It should still be
significantly faster than using the json2.js implementation in
JavaScript.

Actually, the other proposed Rhino project for this year's Summer of
Code was working on general performance optimisations for the Rhino
engine. That's something I'd like to see too.

https://wiki.mozilla.org/Community:SummerOfCode09#Rhino

Cheers,
Raphael

Joran
Aug 9, 2009, 10:49:52 AM

Great, thanks Raphael for your response and the link. That sort of
performance optimization would be great, considering that Rhino
provides such a simple interface into Java from JavaScript, which the
others lack. From what I gather, V8 does complete compilation as
opposed to trace-based JIT compilation; complete compilation costs a
little more startup latency but is said to be simpler.

And thanks: the new JSON object is about twice as fast as json2.js,
and since it's new there's still plenty of time for improvement, so
it's great that we've got it. I'm using Rhino as a server-side JSON
API. The bulk of request time is spent parsing JSON from the database,
then serializing it for the client, so it is a little slow: if we
could drop from 640ms to 44ms, that would be a dream. The slow JSON
parsing also sends my CPU to 100% and blocks other threads. Rhino's
speed in general, however, is fine as it is for my purposes; a faster
JSON object would be the killer win.

Thanks for your work.

Norris Boyd
Aug 10, 2009, 11:15:04 AM


Raphael and I discussed the approach to take for JSON parsing and, as
he indicated, we decided to go for the simplest approach given the
time constraints of the Google Summer of Code program. I'd be happy
for people to put together patches that improve performance of JSON
parsing. If the regexp tokenizer is indeed the slow part of the code,
perhaps the tokenizer in Rhino could be used, for example.

Thanks,
Norris

Joran
Aug 10, 2009, 2:33:33 PM

Thanks Norris. I would help if I could, but unfortunately my knowledge
of Java is limited to scripting it from JavaScript! Poor indeed.

But perhaps this could help, from the V8 discussion on similar JSON
performance issues:

"Rather than hacking some basic JSON support into V8 or your V8
application, I recommend that it be done ... using Ragel:
http://www.complang.org/ragel/

Ragel's compiled state machines are blindingly fast, and hence so are
the implementations of JSON of languages that use it:
http://json.rubyforge.org/
http://modules.gotpike.org/module_info.html?module_id=43

I did some elementary benchmarking of JSON implementations a while
back, and JSON in Ragel was in a class of its own."

Excerpt from: http://www.mail-archive.com:80/v8-u...@googlegroups.com/msg00243.html

Hannes Wallnoefer
Aug 11, 2009, 6:33:03 AM

First, I'd like to thank Raphael for his great work on the ES5
implementation, and Norris for his mentoring. This is incredibly
useful work, and I agree with your approach of putting completeness
and correctness over performance.

I've been working on a Rhino wrapper for Berkeley DB that uses JSON as
its data format, and I wanted to know how fast the JSON parser could
get. Since JSON is such a simple format, I decided to try rewriting
the JSON parser by hand, and it did indeed turn out quite a bit
faster. I filed a new bug with the patch:

https://bug509678.bugzilla.mozilla.org/attachment.cgi?id=393739

Performance should now be close to TraceMonkey and V8. With the
-server JVM it's even faster than my Firefox 3.5.2 (I don't have
recent SpiderMonkey/TraceMonkey/V8 builds with native JSON support for
testing).

In the context of my Berkeley DB wrapper, a query that returns a few
thousand objects now takes ~80 millis for just the ids (no JSON
parsing involved) versus ~120 millis for the full, JSON-parsed
objects. I've just scrapped my plans for caching parsed objects; JSON
isn't the bottleneck anymore.

Hannes

Hannes Wallnoefer
Aug 11, 2009, 6:36:19 AM

On Aug 11, 12:33 pm, Hannes Wallnoefer <hann...@gmail.com> wrote:
>
> https://bug509678.bugzilla.mozilla.org/attachment.cgi?id=393739
>

Sorry, that's the link to the patch, not the bug. This is the correct
link:

https://bugzilla.mozilla.org/show_bug.cgi?id=509678

Hannes

Raphael
Aug 12, 2009, 4:32:59 AM

Awesome, thanks Hannes!

Hannes Wallnoefer
Aug 12, 2009, 5:40:46 AM

I committed the patch. Please let me know if you see any problems with
it.

Hannes

Johan Compagner
Aug 12, 2009, 6:01:26 AM
to dev-tech-js-...@lists.mozilla.org

Now if Joran could do his test again :)


Joran
Aug 12, 2009, 10:44:48 AM

Thanks Hannes. This is great.

After some quick playing, and using the same testparse and
teststringify functions described above, here are some results. For
"object", I used an array of 2082 objects. For "string", I used the
JSON representation of "object". The length of "string" is about
650 KB. I didn't try anything larger than that.

JSON.parse:
Rhino (before Hannes' patch): 640ms
Rhino (after Hannes' patch): 90ms
TraceMonkey: 44ms
V8: 27ms

JSON.stringify:
Rhino (before Hannes' patch): 140ms
Rhino (after Hannes' patch): 165ms
V8: 70ms
TraceMonkey: 44ms

Thanks again, this will mean I also won't have to put a parsed-object
cache in front of my key-value store (Tokyo Tyrant).

Marcello Bastéa-Forte
Aug 12, 2009, 2:03:57 PM
to Joran, dev-tech-js-...@lists.mozilla.org

Out of curiosity, since 27ms is quite low (and risks measurement
overhead and timer inaccuracy), have you tried running the JSON calls
multiple times inside your benchmark loop?

i.e.

var testparse = function(string, testcount) {
  var start = new Date().getTime();
  for (var i = 0; i < testcount; i++) {
    JSON.parse(string);
  }
  return new Date().getTime() - start;
};

var teststringify = function(object, testcount) {
  var start = new Date().getTime();
  for (var i = 0; i < testcount; i++) {
    JSON.stringify(object);
  }
  return new Date().getTime() - start;
};

Also, trying higher iteration counts on smaller JSON strings may give
a better "real-world" performance metric. I'm guessing most JSON
strings are on the order of 10 KB rather than 650 KB.

Marcello

Joran
Aug 13, 2009, 1:52:52 AM

Thanks Marcello, yes, 27ms is desirably low. I tried your suggestion
with the same 650 KB string and a testcount of 100, numerous times,
and V8 consistently returns 2500ms in total, i.e. 25ms per parse.
Firefox 3.5 is about 44ms, TraceMonkey the same, and Safari 4 is
slightly faster.

Perhaps a spread of differently sized strings would give a better
indication of how the JSON object performs under load.
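That "spread of sizes" idea might be sketched like this (the payload shapes, sizes, and iteration counts below are illustrative assumptions, not from the thread): parse payloads of several sizes, scaling the iteration count down as payloads grow so each size does roughly comparable total work.

```javascript
// Build a JSON string of objectCount small objects (hypothetical shape).
function makePayload(objectCount) {
  var arr = [];
  for (var i = 0; i < objectCount; i++)
    arr.push({ id: i, label: "item-" + i, flag: (i & 1) === 1 });
  return JSON.stringify(arr);
}

// Time `iterations` parses of the given string, in milliseconds.
function benchParse(src, iterations) {
  var start = new Date().getTime();
  for (var i = 0; i < iterations; i++) JSON.parse(src);
  return new Date().getTime() - start;
}

var sizes = [10, 100, 1000];
var results = {};
for (var s = 0; s < sizes.length; s++) {
  var count = sizes[s];
  var src = makePayload(count);
  // Fewer iterations for bigger payloads keeps total work comparable.
  results[count] = benchParse(src, Math.max(1, Math.round(10000 / count)));
}
```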

Hannes Wallnoefer
Aug 13, 2009, 10:56:09 AM

I have some more results. I updated my TraceMonkey and V8 builds to
their current svn/hg heads and then compared them to Rhino head using
the script below. It's basically a JSON array of 500 objects
containing some string/number/boolean properties, parsed 500 times. I
called the bench() function repeatedly to make sure the HotSpot
compiler had done its job (obviously, the TraceMonkey and V8 JIT
compilers don't do anything on this level).

function bench() {
  var t0 = Date.now();
  for (var i = 0; i < 500; i++)
    JSON.parse(src);
  return Date.now() - t0;
}

var array = [];
for (var i = 0; i < 500; i++)
  array.push({foo: "BAR", x: 12309, y: false,
              bar: "sdflkjsldfkjlk sdflkj lsdkjf lkjsd flkjdsf"});
var src = JSON.stringify(array);

Here are the results:

V8: ~1250 millis
TraceMonkey: ~800 millis
Rhino with client HotSpot VM: ~2000 millis
Rhino with server HotSpot VM: ~800 millis

Hannes
