Memory leak with Mongo Java Connection


Amar Sharma

Jun 17, 2015, 2:14:58 AM
to mongod...@googlegroups.com

I am constructing the MongoClient connection in the following manner:

public static synchronized MongoClient getInstance(String mongoDbUri) {
    try {
        // Standard URI format: mongodb://[dbuser:dbpassword@]host:port/dbname
        if (mongoClient == null) {
            mongoClient = new MongoClient(new MongoClientURI(mongoDbUri));
        }
    } catch (Exception e) {
        log.error("Error mongo connection : ", e.getCause());
    }
    return mongoClient;
}
Over a period of time, as multiple transactions run, I see the application's memory usage grow, and it is not getting released.

When I analysed the heap dump, I saw that memory consumption was highest in the class

com.mongodb.internal.connection.PowerOfTwoBufferPool

The mongo client is connecting to a mongos instance. The deployment has 3 replica sets across 3 shards and one config server holding the metadata.

To add more details: I have a Spring-managed bean annotated with @Component. The bean has a @PostConstruct method in which the above method is called. In the Spring class we do insert/update/create operations using the MongoClient.

I have been stuck on this issue for the last 2 days. Any pointers or suggested approach would mean a lot to me. Thanks.
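For completeness, the surrounding class is essentially the following (the class name and the SLF4J logger setup are simplified here, and the imports assume the 3.x Java driver):

```java
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MongoConnectionFactory {

    private static final Logger log =
            LoggerFactory.getLogger(MongoConnectionFactory.class);

    // Shared across the whole application; MongoClient maintains its own
    // internal connection pool, so a single instance is sufficient.
    private static MongoClient mongoClient;

    public static synchronized MongoClient getInstance(String mongoDbUri) {
        try {
            // Standard URI format: mongodb://[dbuser:dbpassword@]host:port/dbname
            if (mongoClient == null) {
                mongoClient = new MongoClient(new MongoClientURI(mongoDbUri));
            }
        } catch (Exception e) {
            // Pass the exception itself (not e.getCause(), which may be null)
            // so the full stack trace is logged.
            log.error("Error creating mongo connection", e);
        }
        return mongoClient;
    }
}
```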

Justin Lee

Jun 17, 2015, 8:25:21 AM
to mongod...@googlegroups.com
Are you closing your cursors properly?
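Every find() should be paired with a close, even when an exception is thrown mid-iteration. A sketch ("collection" stands in for whatever DBCollection your code queries; in the 3.x driver DBCursor also implements Closeable, so try-with-resources works too):

```java
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;

public class CursorScanExample {
    public static void scan(DBCollection collection) {
        DBCursor cursor = collection.find();
        try {
            while (cursor.hasNext()) {
                DBObject doc = cursor.next();
                // ... process doc ...
            }
        } finally {
            // Releases the server-side cursor and the driver's buffers
            // even if processing above throws.
            cursor.close();
        }
    }
}
```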

--------------------------------

{ name     : "Justin Lee",
  title    : "Software Engineer",
  twitter  : "@evanchooly",
  web      : [ "10gen.com", "antwerkz.com" ],
  location : "New York, NY" }


Amar Sharma

Jun 17, 2015, 8:54:40 AM
to mongod...@googlegroups.com
Yes Justin, I am closing the cursors.

Jeff Yemin

Jun 17, 2015, 2:41:30 PM
to mongod...@googlegroups.com
Hi there,

Can you provide some statistics about how memory usage changes over time? Does it eventually reach a steady state, or does it keep increasing until the application runs out of memory?

I can tell you off the bat that PowerOfTwoBufferPool is the driver's buffer pool cache.  The driver caches large byte buffers in order to avoid fragmentation of the heap, so it's expected that the driver retains some memory over time.
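If you want to bound how much memory the pool can retain, one knob worth trying is capping the connection pool size, since fewer connections generally means fewer pooled buffers. A sketch against the 3.x API (the URI and the value 20 are placeholders):

```java
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.MongoClientURI;

public class BoundedPoolExample {
    // Sketch: cap connectionsPerHost so the driver retains fewer
    // pooled buffers in PowerOfTwoBufferPool. 20 is arbitrary.
    public static MongoClient create(String uri) {
        MongoClientOptions.Builder options = MongoClientOptions.builder()
                .connectionsPerHost(20);
        return new MongoClient(new MongoClientURI(uri, options));
    }
}
```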


Regards,
Jeff


Amar Sharma

Jun 25, 2015, 1:42:11 PM
to mongod...@googlegroups.com
Hi Jeff,

Thanks for the details. What I see is that RAM usage keeps increasing until we hit an OOM in the application. We recently migrated the data from MySQL to Mongo. There is a sub-document in the documents which increases the document size, so in Java we use the Jackson mapper to convert the JSON to Java objects.

With each iteration of the DBCursor we tried explicitly nulling the reference in the last line of the cursor iteration. That helped to some extent, but we are still getting OOM.

Amar Sharma

Jun 25, 2015, 1:45:09 PM
to mongod...@googlegroups.com
With each iteration of the DBCursor we tried explicitly setting the DBObject to null in the last line of the cursor iteration. That helped to some extent, but we are still getting OOM. Do we need to do anything with the DBCursor fetch? We currently have about 1 million documents, and each document has sub-documents that are quite large (around 4 MB).
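For reference, the read loop is essentially the following (simplified; the field name "largeSubDoc" is a placeholder). One thing we are experimenting with is a projection that skips the large sub-document plus a small batchSize, so only a few 4 MB documents are buffered by the driver at a time:

```java
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;

public class ReadLoopExample {
    public static void readAll(DBCollection collection) {
        // Exclude the large sub-document unless it is actually needed.
        DBObject projection = new BasicDBObject("largeSubDoc", 0);
        // A small batch keeps the driver from holding many 4 MB
        // documents in memory at once.
        DBCursor cursor = collection.find(new BasicDBObject(), projection)
                                    .batchSize(10);
        try {
            while (cursor.hasNext()) {
                DBObject doc = cursor.next();
                // map to a Java object with Jackson, use it,
                // then let it go out of scope
            }
        } finally {
            cursor.close();
        }
    }
}
```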



Jeff Yemin

Jun 25, 2015, 2:46:47 PM
to mongod...@googlegroups.com
Hi Amar,

It's difficult to diagnose these issues from afar without access to the full application, as the driver is just one piece of it. So please allow me to ask a few follow-up questions:
  1. Is your application multi-threaded, and if so, can you estimate the maximum concurrency level (in other words, how many threads are concurrently executing MongoDB operations)?
  2. What is the maximum heap size (e.g. -Xmx) configured for the JVM, and have you tried simply increasing it to see whether that avoids the OOM?
  3. Can you share the heap dump?  If you don't want to do this publicly, we can arrange a way to get it from you privately.

Regards,
Jeff

Amar Sharma

Jun 30, 2015, 2:08:54 AM
to mongod...@googlegroups.com
Thanks Jeff.
The heap size is 3 GB. I will take a new heap dump and get in touch with you.

Is that memory sufficient? We are reading large Mongo documents (usually around 4 MB each).

