I'm trying to iterate through a collection of 5 million records using find(query).iterator(). I have also set the query's batchSize to 100. The iterator gets through about 3 million records and then crashes with an OutOfMemoryError and the following stack trace:
java.lang.OutOfMemoryError: Java heap space
at java.util.HashMap.resize(HashMap.java:462)
at java.util.HashMap.addEntry(HashMap.java:755)
at java.util.HashMap.put(HashMap.java:385)
at com.google.code.morphia.mapping.cache.DefaultEntityCache.notifyExists(DefaultEntityCache.java:40)
at com.google.code.morphia.mapping.cache.DefaultEntityCache.putEntity(DefaultEntityCache.java:81)
at com.google.code.morphia.mapping.Mapper.fromDb(Mapper.java:472)
at com.google.code.morphia.mapping.Mapper.fromDBObject(Mapper.java:267)
at com.google.code.morphia.query.MorphiaIterator.processItem(MorphiaIterator.java:53)
at com.google.code.morphia.query.MorphiaIterator.next(MorphiaIterator.java:48)
From looking into the Morphia source code I can see that entity records are cached in DefaultEntityCache, and that this cache keeps growing, which is most likely the cause of the exception. Is there any way to get a handle on this cache and flush it periodically?
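One workaround I have been considering, in case the cache cannot be flushed directly: avoid a single long-lived iterator altogether and re-query in key ranges (keyset pagination on _id), so that each batch gets a fresh iterator and therefore a fresh per-iterator cache that can be garbage-collected between batches. The sketch below simulates the pattern in plain Java over an in-memory map; the Record entity, field names, and the Morphia calls shown in the comments are assumptions about how this would map onto a real datastore, not tested Morphia code.

```java
import java.util.*;

// Sketch of keyset pagination: each batch is a fresh "query", so any
// per-iterator state (in Morphia, the DefaultEntityCache) stays bounded.
// With Morphia the inner query might look roughly like (untested):
//   ds.createQuery(Record.class)
//     .field("_id").greaterThan(lastId)
//     .order("_id").limit(batchSize).asList();
public class KeysetBatchScan {

    public static int scanAll(int totalRecords, int batchSize) {
        // Stand-in for the MongoDB collection: sorted ids -> payloads.
        NavigableMap<Integer, String> collection = new TreeMap<>();
        for (int i = 1; i <= totalRecords; i++) {
            collection.put(i, "record-" + i);
        }

        int processed = 0;
        Integer lastId = null; // resume point carried between batches
        while (true) {
            // Fresh view per batch: ids strictly greater than lastId.
            SortedMap<Integer, String> tail =
                (lastId == null) ? collection : collection.tailMap(lastId, false);

            List<Map.Entry<Integer, String>> batch = new ArrayList<>();
            for (Map.Entry<Integer, String> e : tail.entrySet()) {
                batch.add(e);
                if (batch.size() == batchSize) break;
            }
            if (batch.isEmpty()) break; // no more records

            for (Map.Entry<Integer, String> e : batch) {
                processed++; // process the entity here
            }
            lastId = batch.get(batch.size() - 1).getKey();
            // Everything referenced only by this batch (including any
            // per-iterator cache) becomes collectable here, so heap use
            // is bounded by batchSize instead of the full result set.
        }
        return processed;
    }

    public static void main(String[] args) {
        System.out.println(scanAll(1000, 100)); // prints 1000
    }
}
```

The same idea should also work with the raw driver (iterating DBObjects and mapping manually), which sidesteps Morphia's entity cache entirely.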