Yes, you are correct: JSON.parse "cheats" because it's precompiled, so no pure JavaScript implementation can, in theory, match its performance characteristics [1].
The advantages are:
1) It has a SAX-like API; for my use case the parsed JSON is about as useful as the string itself (see the sketch after this list)
2) Lower memory usage, as you said
3) JSON.parse cannot process chunks that aren't valid JSON on their own
4) I think JSON.parse is blocking, but I would love someone here to confirm this. If that's the case, you might prefer to spend a bit more total time in clarinet in exchange for not blocking Node. However, this is unlikely to be a factor in most applications, because it would mean many users dealing with large JSON files at the same time
5) Clarinet works outside V8 too :)
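
To make (1) and (3) concrete, here is a minimal sketch of clarinet's event-based API, using the sax-js-style handler names from clarinet's README; the chunk boundaries are deliberately placed mid-token:

var clarinet = require("clarinet")
  , parser   = clarinet.parser();

// Events fire as tokens are seen; no complete document is required.
parser.onopenobject = function (firstKey) { console.log("open object, first key:", firstKey); };
parser.onkey        = function (key)      { console.log("key:", key); };
parser.onvalue      = function (value)    { console.log("value:", value); };
parser.onend        = function ()         { console.log("end"); };

// Chunks need not be valid JSON on their own; parser state
// carries over between writes.
parser.write('{"na');
parser.write('me": "clari');
parser.write('net"}');
parser.close();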
Bottom line: if in doubt, use JSON.parse. It's likely that if you need something like clarinet, you know you need something like clarinet :)
Hope this helps
Nuno
[1] See the comment by Alex Russell at http://www.quora.com/Where-can-I-find-JSON-parser-used-in-V8-JavaScript-Engine
Would an ideal use case for clarinet be the bulk uploading of records as JSON, where the data is in the form of an array? E.g.:
{ data: [
    { ... },
    { ... },
    ...
] }
Then you'd be able to get an event back for each item added to the
'data' key's array?
Regards,
Micheil Smith
--
BrandedCode.com
Yes, you will get one event fired for each element of the array. It's a good use case, especially if you have no need for the document per se. But if the file is not chunked and/or not large enough, why bother? Unless you want to learn about streaming parsers, of course, or if you like all things JavaScript :) Both good reasons in my book. Plus, I did the stats on performance so you can reason about it: is 1 second to parse 13 MB of JSON that slow? For most applications the answer is probably no.
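
For illustration, here is a rough sketch of the bulk-upload case above using clarinet's streaming interface. It assumes the exact shape shown earlier (one top-level object whose "data" key holds an array of flat records), and handleRecord is a hypothetical stand-in for whatever you do with each record:

var fs       = require("fs")
  , clarinet = require("clarinet")
  , stream   = clarinet.createStream();

// Hypothetical consumer -- e.g. insert each record into a database.
function handleRecord(record) { console.log("record:", record); }

var inData = false  // inside the "data" array?
  , record = null   // record currently being rebuilt
  , key    = null;  // last key seen inside that record

// Assumes "data" is the only array and its elements are flat objects.
stream.on("openarray",  function () { inData = true; });
stream.on("closearray", function () { inData = false; });
stream.on("openobject", function (firstKey) {
  if (inData) { record = {}; key = firstKey; }
});
stream.on("key",   function (k) { if (record) key = k; });
stream.on("value", function (v) { if (record) record[key] = v; });
stream.on("closeobject", function () {
  if (record) { handleRecord(record); record = null; }
});

fs.createReadStream("records.json").pipe(stream);

Each record is handed off as soon as its closing brace arrives, so memory use stays flat no matter how many records the file holds.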
Mikeal was asking for filters to do something like that and only selectively emit. Clarinet is, IMO, lower level than that, but one could write an abstraction on top of clarinet that does it.
One thing that I'm sure is not helping pure-JS parser performance is the conversion from buffers to strings for every chunk, in addition to the conversion into JavaScript types.
A declarative model for the parser would also help, so that it only selectively converted values to JavaScript types.
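
Purely as a sketch of what such an abstraction might look like (valuesUnder is a made-up name, and note it filters after clarinet has already converted each scalar; deferring the conversion itself would need support inside the parser, which is the point above):

var clarinet = require("clarinet");

// Made-up helper: surface only values whose key matches wantedKey.
function valuesUnder(wantedKey, onMatch) {
  var parser  = clarinet.parser()
    , current = null;

  parser.onopenobject = function (firstKey) { current = firstKey; };
  parser.onkey        = function (k)        { current = k; };
  parser.onvalue      = function (v) {
    if (current === wantedKey) onMatch(v);  // selective emit
  };
  return parser;
}

// Usage: only "name" values are surfaced.
var p = valuesUnder("name", function (v) { console.log("name:", v); });
p.write('{"id": 1, "name": "clarinet", "tags": ["json"]}');
p.close();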