1. Shut down Hadoop. Restarted the Hazelcast server. Ran garbage collection (GC) from Mancenter. Created a heap dump (see the sketch after this list for one way to capture a dump programmatically). See the following summary from JVisualVM while analyzing the HPROF file:
Total bytes: 49,040,741
Total classes: 2,685
Total instances: 235,562
Classloaders: 14
GC roots: 1,505
Number of objects pending for finalization: 0
2. Restarted Hadoop. Ran a job and created a heap dump:
Total bytes: 96,102,606
Total classes: 2,929
Total instances: 393,400
Classloaders: 15
GC roots: 1,529
Number of objects pending for finalization: 0
3. Shut down Hadoop. Created a heap dump:
Total bytes: 136,635,654
Total classes: 2,929
Total instances: 1,432,769
Classloaders: 15
GC roots: 1,529
Number of objects pending for finalization: 0
4. Ran GC. Created a heap dump:
Total bytes: 16,213,645
Total classes: 2,929
Total instances: 137,577
Classloaders: 15
GC roots: 1,529
Number of objects pending for finalization: 0
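For reference, the following is a minimal sketch of one way to capture such a dump programmatically (not the exact steps I used; I took the dumps from JVisualVM). It assumes a HotSpot JVM and uses the standard com.sun.management.HotSpotDiagnosticMXBean API:

    import java.lang.management.ManagementFactory;
    import com.sun.management.HotSpotDiagnosticMXBean;

    public class HeapDumper {
        public static void main(String[] args) throws Exception {
            // Look up the HotSpot diagnostic MXBean of the running JVM.
            HotSpotDiagnosticMXBean diag = ManagementFactory.newPlatformMXBeanProxy(
                    ManagementFactory.getPlatformMBeanServer(),
                    "com.sun.management:type=HotSpotDiagnostic",
                    HotSpotDiagnosticMXBean.class);

            // live=true dumps only objects reachable from GC roots, which is
            // comparable to running GC before taking the dump.
            diag.dumpHeap("hazelcast-member.hprof", true);
        }
    }

The same kind of dump can be taken externally with jmap -dump:live,format=b,file=heap.hprof <pid>.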
The first test is to see the objects in the heap when the Hazelcast server has run no jobs, so there should be no maps or entries. Analyzing the heap confirms this is true.
The second test is to run a job and check the objects in the heap. Two classes of interest show up, namely com.hazelcast.spi.DefaultObjectNamespace and com.hazelcast.concurrent.lock.LockStoreImpl.
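For context, here is a minimal sketch (not our actual job code) of the kind of IMap usage that, as far as I understand the Hazelcast 3.x internals, creates these objects: locking a map key goes through the lock service, which keeps a LockStoreImpl keyed by a DefaultObjectNamespace for that map. The map and key names below are made up:

    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.core.IMap;

    public class LockStoreDemo {
        public static void main(String[] args) {
            HazelcastInstance hz = Hazelcast.newHazelcastInstance();
            IMap<String, String> map = hz.getMap("job-results");

            // Locking a key goes through the lock service, which creates a
            // lock store for this map's namespace on the owning member.
            map.lock("row-42");
            try {
                map.put("row-42", "value");
            } finally {
                map.unlock("row-42");
            }

            // unlock() releases the lock itself, but the per-map lock store
            // is an internal structure that can outlive the job.
            hz.shutdown();
        }
    }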
The third test is to see whether Hadoop is holding on to these objects, so I shut down Hadoop and created another heap dump. Those objects are still present in the heap.
The fourth test is to see whether running GC clears them from the heap. Analyzing the dump shows that some of the objects were collected, but the majority are still in memory; since they survived a GC, that suggests they are still reachable from GC roots on the Hazelcast member rather than simply uncollected garbage.
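If these structures are still referenced by the member's lock service, GC alone can never reclaim them. A minimal sketch of releasing them explicitly, assuming Hazelcast 3.x (the helper and map name are hypothetical; I have not verified this clears the LockStoreImpl entries):

    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.core.IMap;

    public class Cleanup {
        // Destroying the distributed map when a job is done should also drop
        // the server-side structures tied to its namespace.
        static void releaseJobMap(HazelcastInstance hz, String mapName) {
            IMap<String, String> map = hz.getMap(mapName);
            map.destroy();
        }
    }

It would be worth taking another dump after destroy() to see whether the DefaultObjectNamespace and LockStoreImpl instances go away.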
Thanks,
Praveen Gautam