ok, some updates.
i've attached a sample of the code, which is very basic.
i ran the following tests:
- with batch size 500, the cursor stops after 612 objects.
- with batch size 50, it works fine.
- the collection is sharded, so i also tried querying one of the
shards directly (on my localhost, port 10000) instead of querying the
mongos, with batch size 500.
and it works!
so maybe the bug is in the sharding environment!
import com.mongodb.*;

Mongo db = new Mongo("localhost", 22222);
DB queries = db.getDB("queries");
DBCollection col = queries.getCollection("basicQueryObjects1901");
DBCursor res = col.find().batchSize(500).limit(10000);
int count = 0;
while (res.hasNext()) {
    DBObject obj = res.next();
    count++;
    System.out.println(count);
}
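for reference, the shard-direct test i mentioned was the same loop with only the connection target changed (a sketch, assuming the ports from my setup: 10000 for the shard's mongod, 22222 for the mongos; it needs a running deployment to actually execute):

```java
import com.mongodb.*;

// hypothetical sketch: same query as above, but connected directly to one
// shard's mongod (port 10000 on my machine) instead of the mongos.
Mongo shard = new Mongo("localhost", 10000);
DBCollection shardCol = shard.getDB("queries").getCollection("basicQueryObjects1901");
DBCursor shardRes = shardCol.find().batchSize(500).limit(10000);
int shardCount = 0;
while (shardRes.hasNext()) {
    shardRes.next();  // just count the documents, don't inspect them
    shardCount++;
}
System.out.println("read " + shardCount + " docs directly from the shard");
```

with this i get all the documents back, which is why i suspect the mongos/sharding path rather than the driver's cursor batching itself.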
> Can you send your code, state before and after, and driver version #
>
> On Sat, Jan 15, 2011 at 2:44 PM, sirpy <
had...@gmail.com> wrote:
> > i'm trying to insert/read documents in batches using the java driver,
> > and i've noticed that if i use a batch size of 100+ to read documents,
> > the results are inconsistent and don't retrieve all the results. same
> > thing if i try to insert 100+ documents: not all the documents are
> > saved.
> > the documents are about 10k in size.
> > is there some kind of packet size limit in mongo/driver?