Getting "pymongo.errors.WriteError: 24: Too many open files" error

Pradyot Ghate

Sep 14, 2017, 3:12:00 AM
to mongodb-user
I am trying to move a MongoDB server (single instance) from a server running AWS Linux to another server running Debian 9 "Stretch". It contains roughly 100 million documents across 80k collections. Everything runs smoothly on AWS Linux; however, the script that is transferring the data keeps failing with `pymongo.errors.WriteError: 24: Too many open files` after 500k or so documents are transferred.

I have tried adjusting the ulimit settings, but it still throws the same error.

Following are the limits on both servers -

AWS Linux
ulimit -Hn = 4096
ulimit -Sn = 1024

Debian 9
ulimit -Hn = 999999
ulimit -Sn = 999999

Also worth mentioning: on AWS Linux I am using the "official" MongoDB Community Edition package (installed following the documentation provided by MongoDB), while on Debian I installed the package included in the distribution's repositories.

shane....@10gen.com

Sep 18, 2017, 2:35:55 PM
to mongodb-user
Hi Pradyot,

How does your data transfer script work? Can you post the code? If the script opens many MongoClient instances you may be creating too many connections to the destination node. Each MongoClient performs connection pooling individually and is thread-safe. You should use a single instance per process and use it with many threads. For more details and recommendations see http://api.mongodb.com/python/current/faq.html#how-does-connection-pooling-work-in-pymongo
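The pattern described above — one pooled client per process, shared by many threads — can be sketched with a stdlib-only stand-in. `PooledClient` below is hypothetical and is not the real pymongo API; it only illustrates why sharing one client keeps the socket count low, whereas constructing a new client per batch opens a fresh pool each time:

```python
import threading
from queue import Queue, Empty

# Hypothetical stand-in for a pooled client such as pymongo's MongoClient.
class PooledClient:
    """Opens a 'socket' lazily and reuses it across calls, like a pool."""

    def __init__(self):
        self._pool = Queue()
        self._lock = threading.Lock()
        self.connections_opened = 0

    def _checkout(self):
        try:
            return self._pool.get_nowait()  # reuse an idle connection
        except Empty:
            with self._lock:
                self.connections_opened += 1
            return object()  # pretend to open a new socket

    def insert_one(self, doc):
        conn = self._checkout()
        # ... a real client would send `doc` over `conn` here ...
        self._pool.put(conn)  # return the socket to the pool for reuse

# One client per process, shared by every worker thread:
client = PooledClient()

def worker():
    for i in range(1000):
        client.insert_one({"i": i})

threads = [threading.Thread(target=worker) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With a shared client, the number of sockets ever opened is bounded by the
# number of concurrent threads (here, at most 8) regardless of document count.
print("sockets opened:", client.connections_opened)
```

If the transfer script instead builds a new client for every collection or batch — easy to do accidentally with 80k collections — each one carries its own pool, and the open file descriptors on the destination add up quickly.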

If it's not a problem with the script then it might be a ulimit configuration issue, see https://docs.mongodb.com/manual/reference/ulimit/.
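One detail that can trip this up on Debian 9: `ulimit` values set in a login shell do not apply to a daemon started by systemd, which the Debian-packaged mongod typically is. A sketch of a systemd drop-in raising the limit — the path and service name `mongodb` are assumptions for a stock Debian install, so check yours:

```ini
# /etc/systemd/system/mongodb.service.d/limits.conf
# (path and service name assumed for the Debian-packaged mongod)
[Service]
LimitNOFILE=64000
```

After adding the drop-in, run `systemctl daemon-reload` and restart the service. You can verify the limits that actually apply to the running process (rather than to your shell) with `cat /proc/$(pidof mongod)/limits`.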