VGG for CLS: softmax-normalized results are always 1 or 0 after fine-tuning


Lemma

Nov 6, 2016, 3:31:55 PM11/6/16
to Caffe Users
Hi all,

I am tackling the following two interesting issues when fine-tuning the 16-layer VGG CLS model:
1. When testing the model after fine-tuning, the last softmax layer returns results that are either 1 or 0.
When I check the precision of the softmax results, they are floats printed to 4 digits. At higher precision, a value shown as '0' is actually something like 1.5768e-12, and so on; the ranking is in fact done over the full available floating-point precision. When summing the numbers at such high precision (142 digits), I got 1 + epsilon, slightly more than 1. What do you think? It looks to me like a floating-point phenomenon.
The softmax formula imposes precision according to N, the number of classes; in my case there are 8 classes, which would be 0.125 per class under a uniform distribution. What do you think?
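To illustrate what I mean, here is a minimal numpy sketch (with hypothetical logits, not my actual network outputs): when one logit dominates, the softmax looks like a one-hot vector at 4-digit precision, while the "zeros" are really tiny positive floats and the sum is 1 only up to rounding.

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical logits for 8 classes; one class dominates by a wide margin,
# as can happen after fine-tuning on a well-separated dataset.
logits = np.array([2.0, 1.0, 30.0, 0.5, -1.0, 0.0, 1.5, 2.5], dtype=np.float32)
p = softmax(logits)

# Printed at 4 decimal places the output looks like a one-hot vector ...
print(np.round(p, 4))   # e.g. [0. 0. 1. 0. 0. 0. 0. 0.]
# ... but at full precision the "zeros" are tiny positive numbers,
# and the sum is 1 only up to floating-point rounding.
print(p.min())          # tiny but strictly positive
print(p.sum())          # ~1.0, up to float rounding
```

So values like 1.5e-12 and a sum of 1 + epsilon are exactly what saturated-but-finite softmax outputs look like in float arithmetic.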
I have 8 classes, and my training accuracy is 97% with a loss of 0.04, which is decreasing steadily.
2. As already mentioned, I get one single class with 1 and all the others close to zero.
Testing the model on an unseen, completely clean test set gives 97% accuracy, so I have the impression there is no overfitting issue here, even though it is transfer learning based on VGG.
I know overfitting is a concern in the preliminary CS231n guidelines for transfer learning, and my case would arguably be more appropriate for an SVM-on-VGG-features model, but I succeeded with the original VGG topology with slight modifications, appropriate lr_mults, etc.
Do you think the fact that I am not getting spread-out results in [0, 1] is an unreported indicator of overfitting? I am not sure.
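One way to quantify this "no spread" behavior (a diagnostic sketch I put together, not something from my pipeline) is the average entropy of the softmax outputs over a batch: near-zero entropy means the network is maximally confident on every sample, which is exactly what near-one-hot outputs look like.

```python
import numpy as np

def mean_entropy(probs, eps=1e-12):
    """Average Shannon entropy (in nats) of a batch of probability rows."""
    p = np.clip(probs, eps, 1.0)  # avoid log(0)
    return float(-(p * np.log(p)).sum(axis=1).mean())

# Saturated, one-hot-like predictions vs. maximally uncertain ones, 8 classes:
one_hot_like = np.array([[1 - 7e-12] + [1e-12] * 7] * 4)
uniform = np.full((4, 8), 1.0 / 8)

print(mean_entropy(one_hot_like))  # close to 0: fully confident
print(mean_entropy(uniform))       # log(8) ~ 2.079: no confidence at all
```

Tracking this on the held-out set would show whether the confidence is justified (high accuracy, low entropy) or a symptom of overconfidence.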

I have the suspicion that it is related to normalizing the images to zero mean/variance, or to shuffling; but regarding shuffling, the Caffe framework already takes care of that, so maybe it is something with the zero-variance normalization.
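For reference, here is a minimal sketch of the zero-centering I mean, assuming the standard Caffe BGR channel order and the published VGG ImageNet mean values (my actual prototxt may use different means):

```python
import numpy as np

# Published VGG ImageNet per-channel means, BGR order (Caffe convention).
# Substitute your own dataset's means if they differ.
VGG_MEAN_BGR = np.array([103.939, 116.779, 123.68], dtype=np.float32)

def preprocess(image_hwc_bgr):
    """Zero-center an HxWx3 BGR uint8 image and return float32."""
    return image_hwc_bgr.astype(np.float32) - VGG_MEAN_BGR

# Usage on a random stand-in "image":
img = (np.random.rand(224, 224, 3) * 255).astype(np.uint8)
out = preprocess(img)
print(out.shape, out.dtype)
```

If the means used at test time do not match those used during fine-tuning, the input distribution shifts and the logits can saturate, so this is worth double-checking.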

Appreciate your thoughts.
