--
--
Job Board: http://jobs.nodejs.org/
Posting guidelines: https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to nod...@googlegroups.com
To unsubscribe from this group, send email to
nodejs+un...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en
---
You received this message because you are subscribed to the Google Groups "nodejs" group.
To unsubscribe from this group and stop receiving emails from it, send an email to nodejs+un...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
Yeah I used a similar approach with mongo.
Is it not possible to limit the number of events emitted by json stream?
Sent from my phone, apologies for brevity
I have a similar use case, only with XML files. At the end of my pipeline is a writable stream I called a CradlePusher, because I used the cradle module.
It's implemented as a streams2 writable stream, which will handle back pressure for you when implemented correctly. The whole point of these is that they're easier to get right than old streams.
CradlePusher.prototype._write = function _write(resource, encoding, callback) {
  if (typeof resource.id === 'undefined') {
    return callback(new TypeError('Given resource does not have an ID'));
  }
  var self = this;
  this._db.get(resource.id, function (err, doc) {
    if (!err) {
      self._updateDoc(doc, resource, callback);
    } else if (err.error === 'not_found' && err.reason === 'missing') {
      self._pushNewDoc(resource, callback);
    } else {
      return callback(err);
    }
  });
};
My use case also involves update logic, which is why I'm no longer using bulk updates and have to fetch a doc before updating it. But the interesting part is the use of the _write method's callback: calling it at the right time gets your back pressure handled more gracefully.
var ez = require('ez-streams');

function jsonFileToMongo(errHandler, sourceFileName, targetMongoCollection) {
  var reader = ez.devices.file.text.reader(sourceFileName);
  var writer = ez.devices.mongodb.writer(targetMongoCollection);
  reader.transform(ez.transforms.json.parser()).pipe(errHandler, writer);
}