question about evaluation metrics


sjn...@gmail.com

Mar 17, 2020, 10:42:25 PM
to agriculture-vision
In the submission files, we can only choose one integer label for each pixel.
So I understand that for pixels with overlapping labels, predicting either one of the labels counts as correct (a true positive).
However, I'm not sure how "prediction" in the equation below is computed.

mIoU = true_positive / (prediction + target - true_positive)

For example, suppose I thought a certain area had two overlapping labels 1 & 2 and predicted it as 1,
and the ground truth around that area was actually label 2, so I get true positives for label 2 (by the either-one rule).
In that case, which pixels are counted as the "prediction" for label 2, where I never predicted label 2?

I'm a bit confused because the labels cover different numbers of pixels, so the score might change depending on which label is counted as the "prediction" (specifically, as a false positive).

Also, could you provide the evaluation metric code?

agriculture-vision

Mar 18, 2020, 12:58:38 PM
to agriculture-vision
Hi,

Thank you for your question.

The modified mIoU is computed using a confusion matrix M. As described on our challenge webpage, the confusion matrix is populated as follows:

For each pixel with predicted label x and ground-truth label set Y (see the sketch below):

If x ∈ Y, then M_{y,y} += 1 for each y in Y
Otherwise, M_{x,y} += 1 for each y in Y
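A minimal NumPy sketch of this update rule (the names `update_confusion_matrix`, `pred`, and `labels` are illustrative, not from the official evaluation code):

```python
import numpy as np

def update_confusion_matrix(M, pred, labels):
    # M:      (C, C) confusion matrix, rows = prediction, columns = target
    # pred:   (H, W) int array, one predicted class per pixel
    # labels: (C, H, W) bool array; labels[c, i, j] is True when class c
    #         is in the ground-truth label set of pixel (i, j)
    num_classes, height, width = labels.shape
    for i in range(height):
        for j in range(width):
            x = pred[i, j]
            Y = np.flatnonzero(labels[:, i, j])  # label set of this pixel
            if x in Y:
                for y in Y:
                    M[y, y] += 1  # correct: credit every label in the set
            else:
                for y in Y:
                    M[x, y] += 1  # wrong: count x against every label
    return M
```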

The modified mIoU is then calculated from the confusion matrix:

mIoU = true_positive / (prediction + target - true_positive), averaged across all classes,

where for each class c, true_positive = M_{c,c}, prediction = the sum of row c (pixels predicted as c), and target = the sum of column c (pixels whose label set contains c).
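With M accumulated as above, the score could be computed like this (again a sketch; `modified_miou` is an illustrative name):

```python
def modified_miou(M):
    true_positive = np.diag(M)
    prediction = M.sum(axis=1)  # row c: pixels predicted as class c
    target = M.sum(axis=0)      # column c: pixels whose label set contains c
    with np.errstate(divide='ignore', invalid='ignore'):
        iou = true_positive / (prediction + target - true_positive)
    return np.nanmean(iou)      # average over classes, ignoring empty ones
```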

If a pixel has ground truth labels 1 & 2, then a prediction of 1 means that both M_{1,1} and M_{2,2} will be incremented by 1. Therefore, choosing either one of the labels gets both ground truth labels correct (instead of just one of them).
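For instance, running the sketch above on a single pixel whose label set is {1, 2} and whose prediction is 1:

```python
M = np.zeros((3, 3))
labels = np.zeros((3, 1, 1), dtype=bool)
labels[1, 0, 0] = labels[2, 0, 0] = True  # ground-truth label set {1, 2}
pred = np.array([[1]])                    # predicted class 1
update_confusion_matrix(M, pred, labels)
print(M[1, 1], M[2, 2])                   # 1.0 1.0 -- both labels credited
```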

The Agriculture-Vision Team