Can we finetune weights selectively, not layer-wise?

MD AMIR SOHAIL

Jul 24, 2018, 7:16:38 AM
to Caffe Users
I want to train some weights within a layer selectively, not layer-wise. I will have a binary file indicating whether or not each specific weight should be finetuned. Is there a solver option for that, or any Python code that can do the job?

Przemek D

Aug 22, 2018, 10:56:27 AM
to Caffe Users
You want to update only specific elements of a parameter blob, correct? For example, only some slices of some of the convolution kernels?
If so, I'm afraid you'd have to do this manually. The easiest hack would be to drive the whole training loop from Python (i.e. run the forward and backward passes yourself, then apply the gradients using the newly added apply_update method), but store the values you want to remain constant and restore them after every parameter update, as in the sketch below. Does this have a specific use case? This general strategy seems pretty inefficient, but maybe we could work around it for your particular problem.
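Something along these lines might do it. This is only a rough, untested sketch: it assumes a pycaffe build recent enough to expose apply_update (and clear_param_diffs), and 'solver.prototxt', the layer name 'conv1', and 'freeze_mask.npy' are placeholders for your own files and names.

import numpy as np
import caffe

solver = caffe.SGDSolver('solver.prototxt')  # placeholder solver definition

# Weight blob of the layer to partially freeze ([1] would be the bias blob).
weights = solver.net.params['conv1'][0]

# Boolean mask with the same shape as the weights: True = keep this weight fixed.
mask = np.load('freeze_mask.npy').astype(bool)
assert mask.shape == weights.data.shape

max_iter = 10000  # match your solver's max_iter
for _ in range(max_iter):
    solver.net.forward()                # forward pass
    solver.net.backward()               # backward pass fills the param diffs
    frozen = weights.data[mask].copy()  # remember the values to keep
    solver.apply_update()               # solver updates *all* parameters
    weights.data[mask] = frozen         # put the frozen values back
    solver.net.clear_param_diffs()      # zero the diffs before the next iteration

Note that the solver still computes (and, with momentum, accumulates history for) gradients of the masked weights; it is restoring the stored values after every update that keeps them constant.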