Hi,
I've fine-tuned the VGG architecture for 1500 iterations, and during testing I got an accuracy of 0.997.
However, when I tried to get a confusion matrix with a Python script, the results are nothing like what I expected:
             Positive   Negative
Positive         0        342
Negative         0        342
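To put a number on it, a minimal sketch (assuming rows are the true class and columns the predicted class, as in the matrix above):

```python
# Confusion matrix from above: rows = true class, columns = predicted class.
tp, fn = 0, 342  # true Positives predicted Positive / Negative
fp, tn = 0, 342  # true Negatives predicted Positive / Negative

total = tp + fn + fp + tn
accuracy = (tp + tn) / total

print(accuracy)  # 0.5 -- chance level for a balanced two-class set
```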
I always get the Negative class, for both my training and validation data.
I am not expecting the confusion matrix to match the 0.997 accuracy exactly, but it seems obvious to me that something goes wrong during prediction: with the same script, the GoogLeNet architecture gives about 80% accuracy in the confusion matrix.
In the Python script I initialize the network and transformer as:
net = caffe.Net('models/' + folderToTrain +'/deploy.prototxt', 'models/' + folderToTrain + '/' + fileToTest, caffe.TEST)
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2,0,1))
transformer.set_mean('data', np.array([103.939, 116.779, 123.68]))
transformer.set_channel_swap('data', (2,1,0))
net.blobs['data'].reshape(50,3,224,224)
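One thing I'm unsure about (this is an assumption on my part): caffe.io.load_image returns pixel values in [0, 1], while the mean values above are on the 0-255 scale, so without a transformer.set_raw_scale('data', 255) call the mean subtraction would push every channel far negative. A quick numpy check of the two scales:

```python
import numpy as np

mean = np.array([103.939, 116.779, 123.68])  # 0-255 scale, as in the script
pixel = np.array([0.5, 0.5, 0.5])            # what load_image produces: [0, 1]

print(pixel - mean)        # roughly [-103.4, -116.3, -123.2]
print(pixel * 255 - mean)  # roughly [23.6, 10.7, 3.8] -- a sensible range
```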
Then, I iterate over the array of files to be predicted using:
for i in indexes:
    f = files[i]
    net.blobs['data'].data[...] = transformer.preprocess('data', caffe.io.load_image(join(newPath, f)))
    out = net.forward()
    result = out['prob'].argmax()
    if directory.find('positive') != -1:
        if result == 1:
            positivePositives += 1
        else:
            positiveNegatives += 1
    else:
        if result == 1:
            negativePositive += 1
        else:
            negativeNegatives += 1
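Another thing I'm not certain matters: the blob was reshaped to a batch of 50, so out['prob'] should have shape (50, 2), and argmax() without an axis indexes into the flattened array rather than returning a class label. A small sketch with made-up probabilities:

```python
import numpy as np

# Made-up softmax output: image 0 predicts class 0, the other 49 predict class 1.
prob = np.vstack([[0.6, 0.4], np.tile([0.1, 0.9], (49, 1))])  # shape (50, 2)

flat = int(prob.argmax())          # index into the flattened 100-element array
per_image = int(prob[0].argmax())  # class label for the first image

print(flat, per_image)  # 3 0
```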
#Show results
print "positivePositives: {}".format(positivePositives)
print "positiveNegatives: {}".format(positiveNegatives)
print "negativePositive: {}".format(negativePositive)
print "negativeNegatives: {}".format(negativeNegatives)
accTrain = ((positivePositives + negativeNegatives) * 100) / (positivePositives + positiveNegatives + negativePositive + negativeNegatives)
print "Accuracy: {}".format(accTrain)
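A minor side note on the accuracy line: under Python 2 that expression uses integer division, which works here because the * 100 happens first, but it still truncates fractional percentages. A true-division version (the function name is mine, just for illustration):

```python
def accuracy_percent(tp, fn, fp, tn):
    """Accuracy as a float percentage, using true division."""
    return 100.0 * (tp + tn) / (tp + fn + fp + tn)

print(accuracy_percent(0, 342, 0, 342))  # 50.0
```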
I think my mistake is in the transformer, but I can't figure it out.
Has anybody been able to predict using VGG?
By the way, I am using the 19-layer VGG.
Thanks in advance,