Hi,
Since this question was posted some time ago, have you found a way to solve the issues you were seeing?
I am trying to use batch inserts of 10K records at a time, with each record about 1 KB in size. Everything works fine, but after 60-70 million such records the server runs out of memory.
Could you please elaborate on:
- How many records do you have in total? Are you doing batch inserts of 10K records at a time?
- What method have you adopted for doing the batch inserts (mongorestore / mongoimport / custom code, etc.)?
- What method did you use to limit the cache size?
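If it is custom code, a minimal sketch of what a batched insert loop could look like with PyMongo's insert_many is below; the connection string, database and collection names, and batch size are assumptions for illustration, not details from your setup:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed connection string
coll = client["test"]["records"]  # hypothetical database/collection names

BATCH_SIZE = 10_000  # 10K documents per batch, as described above

def insert_in_batches(docs):
    """Insert an iterable of documents in fixed-size batches."""
    batch = []
    for doc in docs:
        batch.append(doc)
        if len(batch) == BATCH_SIZE:
            # ordered=False lets the server continue past individual errors
            coll.insert_many(batch, ordered=False)
            batch = []
    if batch:
        # flush the final, possibly partial, batch
        coll.insert_many(batch, ordered=False)
```

Knowing whether your loader looks roughly like this, or uses one of the bundled tools, would help narrow things down.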
Also, could you please clarify your statement a bit:
"Mongo's cache is limited to 3GB, but memory is still consumed by the memory-mapped files for collections and indexes. After some time the mongod CPU load reaches 100% and it stops receiving data."
Could you post the output of diagnostic tools (e.g. mongostat) and the relevant mongod logs?
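If mongostat is not convenient, a rough equivalent can be pulled from the serverStatus command; this sketch assumes a local mongod and PyMongo, and the wiredTiger section will only be present if you are running the WiredTiger storage engine:

```python
import time

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed connection string

# Poll memory counters once per second, similar in spirit to mongostat.
for _ in range(10):
    status = client.admin.command("serverStatus")
    mem = status["mem"]  # resident and virtual sizes are reported in MB
    line = "resident={}MB virtual={}MB".format(mem.get("resident"), mem.get("virtual"))
    wt = status.get("wiredTiger")
    if wt is not None:  # absent under the MMAPv1 storage engine
        line += " wt_cache_bytes={}".format(wt["cache"]["bytes currently in the cache"])
    print(line)
    time.sleep(1)
```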
Regards,
Pooja
Hi,
Sorry for the delay in responding. What MongoDB version are you using? Is it possible for you to upgrade to the latest in the 3.2 series (currently 3.2.10) and see if the issue is still there? It may also be worth trying an import with mongoimport to see whether it proceeds without pauses.
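If you do test with your own loader as well, timing each batch makes any stalls easy to see; this is just a sketch, and it assumes a PyMongo collection object and pre-built batches of documents:

```python
import time

def insert_with_timing(coll, batches):
    """Insert each batch and report how long it took; a sudden jump in
    per-batch latency should line up with the pauses you described."""
    for i, batch in enumerate(batches):
        start = time.time()
        coll.insert_many(batch, ordered=False)
        print("batch {}: {} docs in {:.2f}s".format(i, len(batch), time.time() - start))
```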
Best regards,
Kevin