Res-Net: can BatchNorm and Scale layer be in-place?


Nam Vo

Mar 11, 2016, 2:42:53 PM
to Caffe Users
Hey, I want to fine-tune the Residual Network Caffe version released by MSRA.
However, there aren't many examples in Caffe showing how to use the BatchNorm and Scale layers.
More specifically, I want to ask whether these layers can be in-place (bottom and top blobs the same). Does anyone have experience with this?

Stas Sl

Mar 17, 2016, 6:15:41 AM
to Caffe Users
Same question. I see some people use both layers in-place, while others give them separate top blobs.

Nam Vo

Mar 18, 2016, 2:00:27 PM
to Caffe Users
I was able to train with the MSRA prototxt, so it is possible for BatchNorm and Scale to be in-place.
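For reference, the in-place pattern looks like the sketch below: both BatchNorm and Scale name the same blob as bottom and top, so they overwrite the convolution output rather than allocating a new blob. This is a minimal illustration of the pattern, not a copy of the MSRA file; layer names and convolution parameters here are illustrative.

```
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 64
    kernel_size: 7
    stride: 2
    pad: 3
    bias_term: false   # bias is folded into the Scale layer below
  }
}
layer {
  name: "bn_conv1"
  type: "BatchNorm"
  bottom: "conv1"
  top: "conv1"         # in-place: top is the same blob as bottom
  batch_norm_param {
    use_global_stats: false   # false for training; true for testing/deploy
  }
}
layer {
  name: "scale_conv1"
  type: "Scale"
  bottom: "conv1"
  top: "conv1"         # in-place again
  scale_param {
    bias_term: true    # learnable per-channel scale and shift
  }
}
```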

Minh Duc

Mar 26, 2016, 6:20:12 PM
to Caffe Users
Hi Nam,

I have also tried it. Although my network still learns with in-place BatchNorm and Scale layers, I did not see any reduction in memory consumption, which is the main motivation for using in-place layers. How about in your case?

Hossein Hasanpour

Jun 5, 2016, 5:04:03 AM
to Caffe Users
Can someone elaborate on this?
Can the Scale and BatchNorm layers be used in-place now?

Leon

Jul 13, 2016, 9:46:06 PM
to Caffe Users
Same question.

On Saturday, March 12, 2016 at 3:42:53 AM UTC+8, Nam Vo wrote: