If you only want it to pull data from the DB when you get the next document from the cursor, set the batch size to a smaller number. I think the default is 0, which lets the server decide; in that case, the server will send as many documents as will fit in 4MB (I think).
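To make the batching idea concrete, here is a pure-Python sketch (no real driver, fake documents) of how a cursor pulls documents in chunks; with a real driver you would set this via `cursor.batch_size(n)` in pymongo or `cursor.batchSize(n)` in the mongo shell:

```python
def fetch_in_batches(documents, batch_size):
    """Toy model of a cursor: each yield stands for one round trip
    to the server, returning up to batch_size documents."""
    for start in range(0, len(documents), batch_size):
        yield documents[start:start + batch_size]

# 25 fake documents with a batch size of 10 -> 3 round trips (10, 10, 5).
batches = list(fetch_in_batches(list(range(25)), 10))
print([len(b) for b in batches])  # [10, 10, 5]
```

The point is that a smaller batch size trades more round trips for less data held client-side per fetch.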
cr
> --
> You received this message because you are subscribed to the Google Groups "mongodb-user" group.
> To post to this group, send email to mongod...@googlegroups.com.
> To unsubscribe from this group, send email to mongodb-user...@googlegroups.com.
> For more options, visit this group at http://groups.google.com/group/mongodb-user?hl=en.
>
>
In MSSQL, a query results in a Stream; I can then iterate over this
stream one item at a time, adding each to mongo (only one object is
loaded into memory at a time).
For your case, first you must be sure you really need 50 000 objects
(if you're showing them on a page, 50 000 are far too many to
display).
If you really need them all, you can use something like pagination:
get the first 10 and iterate over them, then get the next 10, and so
on, until you reach the end of the collection. It's just like reading a
big file from a file stream (or memory-mapped files): you never load
the entire file into memory.
This way you have only 10 objects in memory at any given time.
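A rough sketch of that pagination loop (pure Python; the lambda stands in for a real find().skip(n).limit(page_size) call, and the data source is hypothetical):

```python
def process_in_pages(fetch_page, page_size):
    """Drive pagination: fetch_page(skip, limit) stands in for
    find().skip(skip).limit(limit). Only one page of documents
    is held in memory at a time."""
    skip = 0
    seen = 0
    while True:
        page = fetch_page(skip, page_size)
        if not page:          # empty page: we reached the end
            break
        for doc in page:      # work on at most page_size docs here
            seen += 1
        skip += page_size
    return seen

# Fake in-memory "collection" of 47 documents, pages of 10.
data = list(range(47))
total = process_in_pages(lambda skip, limit: data[skip:skip + limit], 10)
print(total)  # 47
```

(One caveat worth knowing: skip gets slow for large offsets, so for very big collections a range query on `_id` is the usual alternative.)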
BTW, as far as I know, this is how cursors in mongo work: a query never
returns thousands of objects at once. Instead, it gives you some of them and
you can request the next ones in the queue when needed (or kill the
cursor if you don't want more data).
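That lazy behaviour maps naturally onto a Python generator; this toy cursor hands out documents only on demand, and closing it early plays the role of killing the server-side cursor:

```python
def toy_cursor(query_results):
    """A generator standing in for a database cursor:
    nothing is produced until the caller asks for it."""
    for doc in query_results:
        yield doc

cur = toy_cursor(iter(range(50000)))
first_three = [next(cur) for _ in range(3)]
cur.close()  # like killing the cursor: no more documents are fetched
print(first_three)  # [0, 1, 2]
```

Even though the fake result set has 50 000 entries, only the three requested documents were ever materialised.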
--
[]
Júlio César Ködel G.
"Todo mundo está ""pensando"" em deixar um planeta melhor para nossos filhos...
Quando é que se ""pensará"" em deixar filhos melhores para o nosso planeta?"