Hello all,
While finetuning a model, I noticed something curious: even when I set all the blobs_lr and weight_decay parameters to 0 for every layer, the loss and accuracy still change. After running it for about 5k iterations, I noticed that the accuracy had barely moved from its initial value:
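For reference, this is roughly how I zeroed out the learning rates. A minimal sketch of one layer in the old-style Caffe prototxt, assuming the `blobs_lr`/`weight_decay` parameter names (the layer name and shape parameters are just placeholders, not my actual net):

```protobuf
layers {
  name: "conv1"
  type: CONVOLUTION
  bottom: "data"
  top: "conv1"
  # first entry applies to the weights, second to the biases
  blobs_lr: 0
  blobs_lr: 0
  weight_decay: 0
  weight_decay: 0
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
```

With both `blobs_lr` entries at 0, the solver's effective learning rate for this layer's weights and biases is zero, so I would expect the parameters to stay frozen at their finetuned initialization.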
Initial acc = 64.2%
After 1k iterations = 63.55%
After 2k iterations = 63.7%
After 3k = 64%
After 4k = 63.7% ...etc.
I even removed dropout and checked: the behaviour was unchanged.
Has anyone encountered this kind of behaviour before? If so, can you tell me why it might be happening?