I am trying to run my network on the ImageNet dataset. The problem is that the program crashes when I run it on the entire dataset: memory usage keeps increasing until one epoch has finished, and only then stops growing.
I tried reducing the batch size, but that does not change the total memory usage: with a batch size of 128 instead of 256, the program simply crashes at iteration 1600 instead of 800.
I read a thread that says "the memory used is proportional to prefetch * batch_size".
Does that apply to ImageDataLayer or to the LMDB DataLayer? Right now I am using LMDB; the program does not crash with ImageDataLayer, but that would be too slow.
Is it possible to change the prefetch variable in the Caffe configuration after it has been installed, without rebuilding?
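For context, this is roughly what my LMDB data layer looks like (the source path and layer names here are placeholders, not my actual setup). I believe recent Caffe versions expose a `prefetch` field in `data_param`, while older builds hard-code it as `PREFETCH_COUNT` in `base_data_layer.hpp`; I am not sure which applies to my install:

```protobuf
layer {
  name: "data"
  type: "Data"
  top: "data"
  top: "label"
  data_param {
    source: "path/to/ilsvrc12_train_lmdb"  # placeholder path
    backend: LMDB
    batch_size: 128
    prefetch: 2  # assumption: only honored if my build supports this field
  }
}
```

If the field is not supported in my build, I assume the only option is editing the constant in the source and recompiling, which is what I am trying to avoid.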