to Caffe Users
I trained a network containing an LSTM layer, and I want to freeze the LSTM layer's parameters for fine-tuning. However, I found that the parameters of the LSTM layer could not be fixed using
param { lr_mult: 0 }
In the attachment, I set
loss_weight: 0
and the training loss is indeed 0, but the test loss still changes at every test step (even though the test data is the same each time). I then found that the LSTM parameters still require backward computation, as shown in the second picture, while the other layers do not.
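For reference, this is how I would expect freezing to work in the prototxt (a sketch only; the layer name, bottom names, and num_output here are placeholders, not from my actual net). Note that Caffe's LSTM layer has three learnable blobs, so I give it three param entries:

```
layer {
  name: "lstm1"
  type: "LSTM"
  bottom: "data"   # input sequence
  bottom: "cont"   # sequence-continuation indicators
  top: "lstm1"
  recurrent_param { num_output: 256 }
  # one param entry per learnable blob; the LSTM layer has three:
  # input-to-hidden weights, bias, hidden-to-hidden weights
  param { lr_mult: 0 decay_mult: 0 }
  param { lr_mult: 0 decay_mult: 0 }
  param { lr_mult: 0 decay_mult: 0 }
}
```

With lr_mult and decay_mult set to 0 for all three blobs, I would expect the LSTM weights to stay fixed during fine-tuning, but that does not seem to happen.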