MongoDB consumes all memory


Алексей Гаранин

Sep 27, 2016, 5:56:32 PM
to mongodb-user
Hi guys

I use MongoDB 3.2 with the WiredTiger engine. I am inserting batches of 10K records, each about 1 KB in size. Everything works well, but after 60-70 million such records the machine runs out of memory. The WiredTiger cache is limited to 3 GB, but memory is still consumed by the memory-mapped files for collections and indexes. After some time mongod runs at 100% CPU and stops accepting data. What am I doing wrong? :)

OS: Windows 7
RAM: 8GB
CPU: 4 core
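
For reference, the workload described above can be sketched roughly as follows. This is an illustration only, using pymongo; the database and collection names ("test", "records") are hypothetical.

```python
# Rough sketch of the batch-insert workload: batches of 10K documents,
# ~1 KB each. Names "test"/"records" are placeholders, not from the thread.
BATCH_SIZE = 10_000
PAYLOAD = "x" * 1000  # ~1 KB of payload per document

def make_batch(start):
    """Build one batch of BATCH_SIZE documents of roughly 1 KB each."""
    return [{"_id": i, "payload": PAYLOAD}
            for i in range(start, start + BATCH_SIZE)]

def load(total_docs, uri="mongodb://localhost:27017"):
    """Insert total_docs documents in batches (requires a running mongod)."""
    from pymongo import MongoClient  # pip install pymongo
    coll = MongoClient(uri).test.records
    for start in range(0, total_docs, BATCH_SIZE):
        # ordered=False lets the server continue past individual errors
        coll.insert_many(make_batch(start), ordered=False)
```

Running `load(70_000_000)` against a mongod with an 8 GB host would approximate the scenario in the post.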

Алексей Гаранин

Sep 29, 2016, 3:59:58 AM
to mongodb-user
https://blog.clevertap.com/sleepless-nights-with-mongodb-wiredtiger-and-our-return-to-mmapv1/ :D

On Wednesday, September 28, 2016 at 0:56:32 UTC+3, Алексей Гаранин wrote:

Pooja Gupta

Oct 26, 2016, 3:43:27 AM
to mongodb-user

Hi,

Since this question was posted some time ago, have you found a way to solve the issues you were seeing?

I am inserting batches of 10K records, each about 1 KB in size. Everything works well, but after 60-70 million such records the machine runs out of memory.

Could you please elaborate on:

  • How many records do you have in total? Are you doing batch inserts of 10K records at a time?
  • What method have you adopted for doing batch inserts (mongorestore/mongoimport/custom code, etc.)?

Also, could you please clarify your statement a bit:

The WiredTiger cache is limited to 3 GB, but memory is still consumed by the memory-mapped files for collections and indexes.

  • What method did you use to limit the cache size?
  • How did you determine the memory consumption?
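
On the second point, one common way to check actual cache usage is the `wiredTiger.cache` section of `db.serverStatus()`. A small helper for extracting the relevant fields might look like this (field names are as reported by MongoDB 3.x `serverStatus`; this is a sketch, not part of the original thread):

```python
def cache_usage(server_status):
    """Return (used_bytes, max_bytes) from the WiredTiger cache section
    of a serverStatus document."""
    cache = server_status["wiredTiger"]["cache"]
    return (cache["bytes currently in the cache"],
            cache["maximum bytes configured"])

# Typical use against a live server (requires pymongo and a running mongod):
#   from pymongo import MongoClient
#   status = MongoClient().admin.command("serverStatus")
#   used, cap = cache_usage(status)
```

Comparing `used` against `cap` shows whether the WiredTiger cache itself is respecting the configured limit, as opposed to memory used elsewhere in the process.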

After some time mongod runs at 100% CPU and stops accepting data.

Could you post the output of diagnostic tools (e.g. mongostat) and the relevant mongod logs?

Regards,
Pooja

Алексей Гаранин

Oct 26, 2016, 11:19:41 AM
to mongodb-user
Since this question was posted some time ago, have you found a way to solve the issues you were seeing?
No
  • How many records do you have in total? Are you doing batch inserts of 10K records at a time?
~40 million. Yes, batch inserts of 10K records at a time.

  • What method have you adopted for doing batch inserts (mongorestore/ mongoimport/ custom code etc.)? 

  • What method did you use to limit the cache size?
wiredTiger.engineConfig.cacheSizeGB
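
For reference, the corresponding mongod.conf fragment (YAML form, MongoDB 3.x) for capping the WiredTiger cache at 3 GB would look roughly like this:

```yaml
# mongod.conf fragment: limit the WiredTiger internal cache to 3 GB
storage:
  engine: wiredTiger
  wiredTiger:
    engineConfig:
      cacheSizeGB: 3
```

Note that this setting limits only the WiredTiger internal cache; the mongod process as a whole will use additional memory on top of it (filesystem cache, connections, in-flight operations).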

Could you post the output of diagnostic tools (e.g. mongostat) and the relevant mongod logs?
Attached.

On Wednesday, October 26, 2016 at 10:43:27 UTC+3, Pooja Gupta wrote:
mongo_OOM_2.zip

Kevin Adistambha

Nov 17, 2016, 6:28:26 PM
to mongodb-user

Hi

Sorry for the delay in responding. What MongoDB version are you using? Is it possible for you to upgrade to the latest release in the 3.2 series (currently 3.2.10) and see if the issue is still there? It may also be worth trying the import with mongoimport and checking whether it proceeds without pauses.

Best regards,
Kevin
