cursor returned from find takes minutes to start iterating over result set

Allan Edwards

Apr 12, 2015, 10:48:55 AM
to node-mong...@googlegroups.com
Hi everyone... I am running MongoDB 3.0.1 on Ubuntu Linux. I have a Node.js web application that pulls some data from the server. What is weird is that I run the find command and immediately get a cursor back with no errors, but when I then call any method on the returned cursor, it blocks for a very long time before returning via the callback. I am using the 2.x driver for Node.js.

Here is some sample code... would anyone have any idea why this request blocks for so long?

GLOBAL.db.collection("data").find({ dataId: dataId }, { _id: 1, data: 1 }, { sort: [['_id', 1]] }, function (err, docscursor) {
    if (err) {
        return cb(err);
    }

    // batchSize(1) fetches a single document per round trip to the server.
    // In the 2.x driver batchSize() returns the cursor rather than taking a callback.
    docscursor.batchSize(1);

    var doc = null;

    async.doWhilst(
        function (callback) {
            docscursor.next(function (err, adoc) {
                if (err) {
                    return callback(err);
                }
                doc = adoc;

                // next() returns null once the cursor is exhausted.
                if (doc == null) {
                    return callback(null);
                }

                async.series([
                    function (callback) {
                        self.logTransfer(securityUserId, doc.data.length, function (err) {
                            callback(err);
                        });
                    },
                    function (callback) {
                        res.write(new Buffer(doc.data));
                        callback(null);
                    }
                ], function (err) {
                    callback(err);
                });
            });
        },
        function test() {
            return doc != null;
        },
        function (err) {
            cb(err);
        }
    );
});

Thanks!!!
Allan

Christian Amor Kvalheim

Apr 13, 2015, 4:19:18 AM
to node-mong...@googlegroups.com
Hi Allan

Setting a batch size of 1 is always a bad idea, as it forces the driver to perform a round trip to the server for each document, which massively slows things down. This is also a situation where you can simplify your code by using streams instead of dealing with the complexity of async.
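A minimal sketch of the stream approach, reusing the dataId, res, and cb variables from your snippet (in the 2.x driver the cursor returned by find() is itself a Readable stream):

var cursor = GLOBAL.db.collection("data")
    .find({ dataId: dataId }, { _id: 1, data: 1 })
    .sort([['_id', 1]]);

// Each 'data' event delivers one document; the driver fetches them
// from the server in large batches behind the scenes.
cursor.on('data', function (doc) {
    res.write(new Buffer(doc.data));
});

cursor.on('error', function (err) {
    cb(err);
});

// 'end' fires once the cursor is exhausted.
cursor.on('end', function () {
    cb(null);
});

If you still need to do asynchronous work per document (like your logTransfer call), you can call cursor.pause() inside the 'data' handler and cursor.resume() when that work completes. Note there is no batchSize(1) here: leaving the default batch size lets the driver amortize the round trips to the server.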
