java driver packet size limit


sirpy

Jan 15, 2011, 2:44:31 PM1/15/11
to mongodb-user
I'm trying to insert/read documents in batches using the Java driver,
and I've noticed that if I use a batch size of 100+ to read documents,
the results are inconsistent and don't retrieve all the results. Same
thing if I try to insert 100+ documents: not all the documents are
saved.
The documents are about 10k in size.
Is there some kind of packet size limit in mongo/the driver?

Scott Hernandez

Jan 15, 2011, 3:04:35 PM1/15/11
to mongod...@googlegroups.com
There is a limit and it is the maximum BSON size (basically), but the
driver will break a large batch up into multiple requests/responses
rather than drop data.
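For scale, here is a back-of-the-envelope sketch. It assumes the 4MB max BSON size of 1.6-era servers (later versions raised this to 16MB); the class and method names are made up for illustration. With ~10k documents, only a few hundred fit under that ceiling, so a batch of 500 lands right where splitting kicks in:

```java
// Rough arithmetic only. MAX_BSON_1_6 is the ~4MB ceiling of MongoDB 1.6-era
// servers (an assumption for this sketch; newer versions allow more).
public class BatchSizeMath {
    static final long MAX_BSON_1_6 = 4L * 1024 * 1024;

    // How many documents of a given size fit under the ceiling at once.
    static int docsPerMessage(long docSizeBytes) {
        return (int) (MAX_BSON_1_6 / docSizeBytes);
    }

    public static void main(String[] args) {
        // ~10k documents: about 409 fit, so a batch of 500 must be split.
        System.out.println(docsPerMessage(10 * 1024));
    }
}
```

That splitting is normally transparent to the application, which is why losing results mid-batch looks like a bug rather than a hard limit.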

Are you doing a "safe" insert? Are there any errors?

http://mongodb.org/display/DOCS/Last+Error+Commands


Do you have a sample of what you are doing?
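Without a safe write, server-side errors are swallowed. A minimal sketch of checking for them with the 2.x-era Java driver (the host, port, and collection names here are placeholders, and exact insert overloads vary by driver version):

```java
// Sketch, assuming the 2.x Java driver API: WriteConcern.SAFE makes each
// insert wait for a getLastError acknowledgement, so a dropped write
// surfaces as an error instead of vanishing silently.
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.Mongo;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;

public class SafeInsertExample {
    public static void main(String[] args) throws Exception {
        Mongo mongo = new Mongo("localhost", 27017); // placeholder connection
        DB db = mongo.getDB("queries");
        DBCollection col = db.getCollection("basicQueryObjects1901");
        for (int i = 0; i < 100; i++) {
            BasicDBObject doc = new BasicDBObject("i", i);
            // Acknowledged write: errors are reported per insert.
            WriteResult result = col.insert(doc, WriteConcern.SAFE);
            String err = result.getError();
            if (err != null) {
                System.out.println("insert " + i + " failed: " + err);
            }
        }
        mongo.close();
    }
}
```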


Eliot Horowitz

Jan 15, 2011, 3:04:59 PM1/15/11
to mongod...@googlegroups.com
No limit...
Can you send your code, state before and after, and the driver version #?

sirpy

Jan 19, 2011, 8:09:51 AM1/19/11
to mongodb-user
OK, some updates.
I've attached a sample of the code, which is very basic.
I've run the following tests:
- with batch size 500, it stops after 612 objects.
- with batch size 50, it works fine.
The collection is sharded, so I've also tried querying one of the
shards directly instead of querying the mongos (on my localhost, port
10000), with batch size 500, and it works!
So maybe the bug is in the sharding environment.

Mongo db = new Mongo("localhost", 22222);
DB queries = db.getDB("queries");
DBCollection col = queries.getCollection("basicQueryObjects1901");
DBCursor res = col.find().batchSize(500).limit(10000);
int count = 0;
while (res.hasNext()) {
    count++;
    DBObject obj = res.next();
    System.out.println(count);
}



Nat

Jan 19, 2011, 8:14:54 AM1/19/11
to mongodb-user
Did you see anything in the mongos log file?

Alvin Richards

Jan 19, 2011, 5:58:30 PM1/19/11
to mongodb-user
What version of the driver are you using?

-Alvin

Eliot Horowitz

Jan 20, 2011, 12:45:22 AM1/20/11
to mongod...@googlegroups.com
What version of mongo are you using?
There was an issue with limit() in mongos at some point.

sirpy

Jan 20, 2011, 8:28:33 AM1/20/11
to mongodb-user
I'm using version 1.6.5 with Java driver 2.3.
I couldn't see anything special in the mongos log.