Dynamically freezing parts of net during training


Swami

Feb 13, 2017, 12:36:27 PM
to Caffe Users
I am wondering if it is possible (using pycaffe) to dynamically set or modify the lr_mult parameters of different layers during training, e.g. to freeze parts of the net for some iterations. Any help would be really appreciated.
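Since Caffe reads lr_mult from the prototxt when the net is constructed, changing it mid-run usually means either restarting from a snapshot with an edited prototxt, or running your own training loop and scaling/zeroing the gradients yourself before the update. The sketch below illustrates the second idea in plain NumPy rather than pycaffe (layer names and the SGD loop are made up for illustration): a per-layer lr_mult scales each layer's update, and setting it to 0 freezes that layer on the fly.

```python
import numpy as np

def sgd_step(params, grads, lr, lr_mult):
    """One SGD update with a per-layer learning-rate multiplier.

    A layer whose lr_mult is 0 is effectively frozen: its weights
    do not move that iteration. This mimics the effect of Caffe's
    per-layer lr_mult, but outside the framework.
    """
    for name in params:
        params[name] -= lr * lr_mult.get(name, 1.0) * grads[name]

# Two hypothetical layers with dummy weights and gradients.
params = {"conv1": np.ones(3), "fc1": np.ones(3)}
grads  = {"conv1": np.ones(3), "fc1": np.ones(3)}

# Phase 1: train everything.
lr_mult = {"conv1": 1.0, "fc1": 1.0}
sgd_step(params, grads, lr=0.1, lr_mult=lr_mult)

# Phase 2: dynamically freeze conv1 by setting its lr_mult to 0.
lr_mult["conv1"] = 0.0
sgd_step(params, grads, lr=0.1, lr_mult=lr_mult)

print(params["conv1"])  # moved only in phase 1: [0.9 0.9 0.9]
print(params["fc1"])    # moved in both phases:  [0.8 0.8 0.8]
```

In pycaffe terms, the analogous trick is to zero the relevant blobs' `diff` arrays (e.g. `solver.net.params['conv1'][0].diff`) before the weight update is applied, which requires a manual forward/backward/update loop rather than plain `solver.step()`.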