How to fix the parameters in lstm layer?


吴一鸣
Jul 4, 2017, 11:32:58 PM
to Caffe Users
I trained a network containing an LSTM layer, and for finetuning I want to freeze the LSTM layer's parameters. However, I found that the parameters in the LSTM layer could not be fixed using

param { lr_mult: 0 }

In the attachment I set

loss_weight: 0

and the training loss is indeed 0, but the test loss changes at every test step, even though the test data is the same each time. I then found that the LSTM parameters still require backward computation, as the second picture shows, while the other layers do not.
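For context, here is a minimal sketch of how freezing is usually written in a prototxt. It assumes the standard Caffe LSTM layer, which carries three parameter blobs (input-to-hidden weights, bias, hidden-to-hidden weights), so one param block is needed per blob; the layer and blob names below are hypothetical:

```
layer {
  name: "lstm1"          # hypothetical layer name
  type: "LSTM"
  bottom: "data"
  bottom: "clip"
  top: "lstm1"
  # One param block per parameter blob.
  # lr_mult: 0 stops gradient updates to the blob;
  # decay_mult: 0 also stops weight decay from changing it.
  param { lr_mult: 0  decay_mult: 0 }  # input-to-hidden weights
  param { lr_mult: 0  decay_mult: 0 }  # bias
  param { lr_mult: 0  decay_mult: 0 }  # hidden-to-hidden weights
}
```

Note that lr_mult: 0 only prevents the solver from updating the weights; it does not by itself skip the backward pass through the layer. To stop gradients from flowing to the layer's inputs, Caffe also provides propagate_down: false in the layer definition.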
Attachments: finetune.png, lstm.png, loss.png