I am currently using an accuracy layer that computes accuracy as (number of correct predictions) / (number of elements). Since I am working with an unbalanced dataset, I'd like to replace it with average precision or average recall (the average of the diagonal of the row-normalized confusion matrix) as the metric for evaluating performance. I wanted to know whether this can be done in the Python wrapper of Caffe.
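For what it's worth, a metric like this can also be computed outside the network in plain Python: run the net over the test set (e.g. with pycaffe's `net.forward()`), collect the argmax predictions and true labels, and feed them into something like the sketch below. The function name and array shapes here are my own assumptions, not part of Caffe's API:

```python
import numpy as np

def average_recall(labels, predictions, num_classes):
    """Mean of the diagonal of the row-normalized confusion
    matrix, i.e. the average per-class recall."""
    cm = np.zeros((num_classes, num_classes), dtype=np.float64)
    for t, p in zip(labels, predictions):
        cm[t, p] += 1
    row_sums = cm.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1  # guard against classes absent from the test set
    cm_norm = cm / row_sums
    return cm_norm.diagonal().mean()

# Unbalanced toy example: plain accuracy is 5/6, but average
# recall weights each class equally: (4/4 + 1/2) / 2 = 0.75.
labels      = [0, 0, 0, 0, 1, 1]
predictions = [0, 0, 0, 0, 1, 0]
print(average_recall(labels, predictions, 2))  # 0.75
```

This avoids writing a new Caffe layer entirely, at the cost of computing the metric in a separate evaluation script rather than during solving.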
I have read parts of the Development page on how to create a new forward layer, but I would prefer to work directly in the wrapper.
Thanks in advance for any help.