Accuracy per class?

Jose Carranza

Aug 30, 2017, 12:19:22 PM
To: Caffe Users
Hi guys

I noticed there was a PR that attempted to do this. Was this ever merged into the main trunk? If so, how do I use it?

Thanks in advance for any info about it :)

Joseph Young

Aug 30, 2017, 1:38:36 PM
To: Caffe Users
I'm able to get per-class accuracies by adding a second top to my accuracy layer.

Jose Carranza

Aug 30, 2017, 2:15:45 PM
To: Caffe Users
I see. Do you have an example of this? I use a second one for top-5, but I have no idea how to tell Caffe to do per-class accuracy :S

Joseph Young

Aug 30, 2017, 5:34:04 PM
To: Caffe Users
I did as they said here, though I'm not using the DIGITS version of Caffe: https://github.com/NVIDIA/DIGITS/issues/506

My accuracy layer is as follows:

layer {
    name: "Loss3/Accuracy_Accuracy"
    type: "Accuracy"
    bottom: "Loss3/InnerProduct_Loss_fc1"
    bottom: "label"
    top: "Loss3/Accuracy_Accuracy"
    top: "class_accuracies"
    include {
        phase: TEST
    }
}

This produces an output like this (for 4 classes):

I0830 22:29:43.862030  8673 caffe.cpp:330] Loss3/Accuracy_Accuracy = 0.68421
I0830 22:29:43.862037  8673 caffe.cpp:330] Loss3/SoftmaxWithLoss_Loss = 1.15139 (* 1 = 1.15139 loss)
I0830 22:29:43.862042  8673 caffe.cpp:330] class_accuracies = 0.949666
I0830 22:29:43.862048  8673 caffe.cpp:330] class_accuracies = 0.681495
I0830 22:29:43.862053  8673 caffe.cpp:330] class_accuracies = 0.550313
I0830 22:29:43.862059  8673 caffe.cpp:330] class_accuracies = 0.563993

Jose Carranza

Aug 30, 2017, 6:08:02 PM
To: Caffe Users
Got it. Is there a way to pick up the class names, or do the results appear in the same order/indexing as the class files one provides to the LMDB scripts?

Przemek D

Aug 31, 2017, 2:44:35 AM
To: Caffe Users
Caffe is unaware of your class names, but the per-class accuracy results are indexed in the same way you indexed the classes in your LMDB.
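For example, here is a quick sketch (not part of Caffe, and assuming you still have the labels file you used when building the LMDB, here called labels.txt, with one class name per line in index order) of how you could pair the logged values with the names yourself:

# Map the class_accuracies values printed in the test log back to class names.
# Assumes labels.txt lists the classes in the same order used to build the LMDB.
with open("labels.txt") as f:
    class_names = [line.strip() for line in f if line.strip()]

# Values copied from the log output earlier in this thread, in the order printed.
class_accuracies = [0.949666, 0.681495, 0.550313, 0.563993]

for name, acc in zip(class_names, class_accuracies):
    print("%s: %.4f" % (name, acc))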

Jose Carranza

Sep 18, 2017, 7:08:30 PM
To: Caffe Users
OK, final question: does Caffe keep track of the number of images per class? In my case I'm getting an overall accuracy of 90%, but individual classes get values like 1% or less. How does it do the calculation, or am I doing something wrong?

Przemek D

Sep 19, 2017, 2:32:26 AM
To: Caffe Users
It does count the number of images per class, but that is an internal variable you don't have access to outside the layer code. The calculation is roughly as follows (written as Python for clarity; instances is the list of (predicted_label, ground_truth) pairs and num_classes is the number of classes):

count_examples = 0
count_hits = 0
count_examples_perclass = [0] * num_classes
count_hits_perclass = [0] * num_classes
for predicted_label, ground_truth in instances:
    count_examples += 1
    count_examples_perclass[ground_truth] += 1
    if predicted_label == ground_truth:
        count_hits += 1
        count_hits_perclass[ground_truth] += 1
accuracy = count_hits / count_examples
# element-wise vector division, for brevity
accuracy_perclass = [h / n for h, n in zip(count_hits_perclass, count_examples_perclass)]
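For example, with a made-up toy input (just to illustrate what the results look like, not Caffe output):

# Suppose, before the loop above:
num_classes = 2
instances = [(0, 0), (0, 0), (1, 1), (0, 1)]   # (predicted_label, ground_truth) pairs
# Then afterwards:
#   accuracy          = 0.75         (3 of 4 predictions correct)
#   accuracy_perclass = [1.0, 0.5]   (class 0: 2/2 hits, class 1: 1/2 hits)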

If you've got issues with accuracy, try this pull request: https://github.com/BVLC/caffe/pull/5836
It greatly improves and optimizes the calculation, and includes a GPU implementation.