Loss Stuck at a Constant Value at ~Step 3000

Juliette Starling

Sep 1, 2016, 7:53:35 AM
to Caffe Users
I'm training a vanilla Inception network with Caffe. Training seems to go well at first, but at around step 3000 the loss suddenly gets "stuck" at a single value. If the loss were merely asymptotic, that would be one thing, but here it is *exactly* the same number for every step afterwards. Is this typical, or a sign of some problem?

신승원

Sep 3, 2016, 9:06:35 PM
to Caffe Users
It sounds like the parameters are stuck in a local minimum.
You can try fine-tuning with a lower learning rate, and if the loss still doesn't move, try something new - for example, training a freshly (randomly) initialized network with a different solver and/or learning rate.
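In Caffe you can do that by lowering base_lr in the solver file and resuming from the last snapshot. A minimal sketch - the filenames, snapshot iteration, and values below are placeholders, not from your setup:

# solver.prototxt -- same setup as before, but with a smaller learning rate
net: "train_val.prototxt"          # your existing network definition
base_lr: 0.0001                    # e.g. 10x smaller than the original
lr_policy: "fixed"
momentum: 0.9
weight_decay: 0.0005
max_iter: 100000
snapshot: 5000
snapshot_prefix: "snapshots/inception"

# Resume from a snapshot taken near the stuck point, so training
# continues from there with the new learning rate:
# caffe train --solver=solver.prototxt --snapshot=snapshots/inception_iter_3000.solverstate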

A few days ago my loss got stuck too, and it was nowhere near low enough; I needed it much lower.
I had been using SGD and AdaGrad with various learning rates, and neither solved the problem.
Then I tried Adam. At first the loss went up and I thought it was definitely wrong, but at around epoch 14 it suddenly dropped well below the minimum loss I had reached with SGD. This is just one small, specialized example, though.
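If you want to try the same thing, switching solvers in Caffe is just a change to the solver definition. A rough sketch (the momentum/momentum2/delta values shown are Caffe's Adam defaults, not tuned for your problem):

# solver.prototxt -- switch the optimizer to Adam
net: "train_val.prototxt"
type: "Adam"             # instead of the default "SGD" (or "AdaGrad")
base_lr: 0.001
momentum: 0.9            # Adam's beta1
momentum2: 0.999         # Adam's beta2
delta: 1e-8              # epsilon term for numerical stability
lr_policy: "fixed"       # Adam is usually run with a fixed base_lr
max_iter: 100000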
