Why the model was changed even if setting blobs_lr = 0?


Jyh-Jing Hwang

Sep 15, 2014, 5:06:50 AM
to caffe...@googlegroups.com
Greetings,

Thanks for developing such an amazing tool for deep learning.  I'm using Caffe and finetuning the ImageNet model for another task.  However, I've run into a problem: finetuning does not improve performance at all.  To narrow down the cause, I set blobs_lr = 0 in every layer except the softmax layer, since my program uses only the pool5 features.
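For context, this is roughly how I set the multipliers in each frozen layer (a sketch in the old V1 prototxt format; the layer name and other fields are just placeholders):

```
layers {
  name: "conv1"
  type: CONVOLUTION
  # blobs_lr appears twice: first for the weights, then for the biases.
  # Setting both to 0 should freeze this layer during finetuning.
  blobs_lr: 0
  blobs_lr: 0
  # ... remaining convolution_param fields unchanged from the ImageNet model
}
```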

Contrary to my expectation, the performance got worse rather than staying unchanged, compared to using the pre-trained, untuned model directly.  I don't understand what happened during the finetuning process.  I'd be very grateful for any suggestions.

Thank you.