Hello, I'm training a net that uses in-place ReLUs. My expectation is that in-place layers should not add to memory use.
However, this is what I observe during net initialization:
I1031 14:26:32.659144 28921 layer_factory.hpp:77] Creating layer conv2
I1031 14:26:32.659160 28921 net.cpp:106] Creating Layer conv2
I1031 14:26:32.659164 28921 net.cpp:454] conv2 <- conv1
I1031 14:26:32.659171 28921 net.cpp:411] conv2 -> conv2
I1031 14:26:32.661525 28921 net.cpp:150] Setting up conv2
I1031 14:26:32.661543 28921 net.cpp:157] Top shape: 4 32 136 241 (4195328)
I1031 14:26:32.661546 28921 net.cpp:165] Memory required for data: 1244256320
I1031 14:26:32.661554 28921 layer_factory.hpp:77] Creating layer conv2/relu
I1031 14:26:32.661563 28921 net.cpp:106] Creating Layer conv2/relu
I1031 14:26:32.661567 28921 net.cpp:454] conv2/relu <- conv2
I1031 14:26:32.661572 28921 net.cpp:397] conv2/relu -> conv2 (in-place)
I1031 14:26:32.661732 28921 net.cpp:150] Setting up conv2/relu
I1031 14:26:32.661741 28921 net.cpp:157] Top shape: 4 32 136 241 (4195328)
I1031 14:26:32.661743 28921 net.cpp:165] Memory required for data: 1261037632
Why does the memory required for data increase after an in-place ReLU layer? Shouldn't conv2/relu reuse the same blob as conv2, so that no additional memory needs to be allocated?
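For reference, the ReLU is declared in-place by giving it the same bottom and top blob name; assuming the definition matches the layer names in the log above, it looks roughly like this:

layer {
  name: "conv2/relu"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"   # same blob name as the bottom, which is what makes Caffe log it as (in-place)
}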
Thanks.