iter_size and batch_norm_layer

康洋

Sep 12, 2016, 4:09:17 AM
to Caffe Users
I am training ResNet-50 with a large iter_size, and each batch has 10 images. This configuration already costs me about 6 GB of memory in Caffe, so I cannot set a larger batch_size. Although a large iter_size lets me train ResNet-50, a batch of only 10 images hurts batch normalization. Are there any ideas or examples for reducing memory cost in Caffe so that I can use batch normalization in a large, deep network like ResNet-50? Thanks!
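
For context, here is a minimal sketch of the kind of configuration being described, assuming a typical Caffe solver and train prototxt layout; the file names, paths, and values are illustrative, not the actual ones used:

    # solver.prototxt -- gradients are accumulated over iter_size forward/backward
    # passes, so the effective batch is iter_size * batch_size, but each pass still
    # holds only batch_size images in memory.
    net: "resnet50_train_val.prototxt"   # hypothetical file name
    iter_size: 25                        # effective batch = 25 * 10 = 250 images
    base_lr: 0.1
    momentum: 0.9
    weight_decay: 0.0001

    # train_val.prototxt (data layer) -- only 10 images fit per forward pass
    layer {
      name: "data"
      type: "Data"
      top: "data"
      top: "label"
      include { phase: TRAIN }
      data_param {
        source: "imagenet_train_lmdb"    # hypothetical path
        batch_size: 10
        backend: LMDB
      }
    }

    # BatchNorm in Caffe computes mean and variance over the current mini-batch
    # only (the 10 images above), so iter_size does not improve the statistics.
    layer {
      name: "bn_conv1"
      type: "BatchNorm"
      bottom: "conv1"
      top: "conv1"
      batch_norm_param { use_global_stats: false }
    }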