Writing 500kb of JSON


P. Douglas Reeder

Oct 27, 2012, 11:02:13 PM
to nod...@googlegroups.com
One of the things my app needs to do is write a large JSON file to disk.  Currently, it's implemented naively:

writeStream = fs.createWriteStream(process.cwd() + "/staticRoot/album.json", {"encoding": "utf8"});

writeStream.addListener("error", function (error) {
    console.error("album.json:", error);
    if (! writeError)   // preserve 1st error
        writeError = error;
    callback(writeError);   // on error, abort writing next file
});

writeStream.end(JSON.stringify(album));


This appears to work OK when the JSON file is 150 KB (the largest size I can readily test), but it needs to work when the file is up to 500 KB. I'm not sure how big the write buffer is, and I lack confidence that this is the right approach. The bulk of the JSON file is an array of objects, so I could write the JSON one array item at a time, checking the return value of writeStream.write() in a manner analogous to the following HTML file writing:

writeStream.addListener("drain", writeUntilBufferFull);

writeStream.write("<h1>" + title + "</h1>\n");
writeStream.write("<table>\n");

function writeUntilBufferFull() {
    var longDate, urlFileName, htmlFileName, row;
    console.log("writeUntilBufferFull()   p=", p, "   metadata:", JSON.stringify(pictureMetadata[p]));
    while (p < pictureMetadata.length) {
        longDate = pictureMetadata[p].date ? pictureMetadata[p].date.toLocaleDateString() : "";
        urlFileName = encodeURIComponent(pictureMetadata[p].fileName).replace(/'/g, '%27');
        htmlFileName = pictureMetadata[p].fileName.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;").replace(/"/g, "&quot;");
        row = " <tr><td align='right'>" + (p+1) + "</td><td align='right'>" + longDate +
            "</td><td><a class='gallery' href='pictures/" + urlAlbumName + "/" + urlFileName + "' title='" + longDate + "'>" + htmlFileName + "</a></td>" +
            "<td><a download=\"" + htmlFileName + "\" href='full-size/" + urlAlbumName + "/" + urlFileName + "'>full-size</a></td></tr>\n";
        if (! writeStream.write(row))
            break;
        ++p;
    }

    if (p < pictureMetadata.length) {
        ++p;   // the row that filled the buffer was still queued, so skip past it
    } else {
        writeStream.removeListener("drain", writeUntilBufferFull);
        writeStream.end();
        callback(writeError);
    }
}


What's the best strategy for writing large JSON files?


mscdex

Oct 27, 2012, 11:09:04 PM
to nodejs
On Oct 27, 11:02 pm, "P. Douglas Reeder" <reeder...@gmail.com> wrote:
> What's the best strategy for writing large JSON files?

You could look into using a streaming JSON module. Here's one you
might try: https://github.com/dominictarr/JSONStream

dolphin278

Oct 28, 2012, 1:29:55 AM
to nod...@googlegroups.com

P. Douglas Reeder

Oct 28, 2012, 10:40:15 AM
to nod...@googlegroups.com
Thanks! I'll give JSONStream a try.

On Sunday, October 28, 2012 1:30:24 AM UTC-4, Boris Egorov wrote:

Ben Noordhuis

Oct 29, 2012, 7:49:04 AM
to nod...@googlegroups.com
Depends on your definition of 'large' and how often you need to write
that data. If it's just a one-off or relatively rare thing, use
fs.writeFile() and call it a day, i.e.:

var data = JSON.stringify(obj);
fs.writeFile('/path/to/file', data, 'utf8', cb);