Per-class loss normalization for softmax layer in Caffe for FCNs?


Tarun Sharma

Aug 14, 2016, 10:08:24 PM
to Caffe Users
Hello,

For FCNs (fully convolutional networks), I want to be able to normalize the softmax loss for each class by the number of pixels of that class in the ground truth. Has anyone implemented this?
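
To make it concrete, this is roughly the computation I have in mind (plain numpy sketch, not an actual Caffe layer; the function name is mine):

import numpy as np

# prob: softmax output, shape (C, H, W); label: ground truth, shape (H, W)
def per_class_normalized_loss(prob, label, num_classes, eps=1e-12):
    loss = 0.0
    for c in range(num_classes):
        mask = (label == c)
        n_c = mask.sum()          # pixels of class c in the ground truth
        if n_c == 0:
            continue              # class absent from this image
        # cross-entropy over the pixels of class c, divided by their count
        loss += -np.log(prob[c][mask] + eps).sum() / n_c
    return loss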


Thank you,
Tarun Sharma

Ruud

Sep 8, 2016, 2:02:05 PM
to Caffe Users
Dear Tarun,

I have the same question! Did you manage to find a solution?

Thanks,
Ruud

xdtl

Sep 9, 2016, 12:43:11 PM
to Caffe Users
Do you mean setting the normalization parameter in the loss layer as below?

layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "score"
  bottom: "label"
  top: "loss"
  loss_param {
    normalize: true
  }
}
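
If I read the loss layer code right, this divides by the total number of non-ignored labeled pixels, not by the per-class counts. Newer Caffe versions spell the same option with the normalization enum:

loss_param {
  normalization: VALID   # divide by the count of non-ignored pixels (the default)
}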



Juan Carlos Leon Alcazar

Oct 20, 2016, 3:23:18 PM
to Caffe Users
Isn't 'true' the default setting for that parameter?

nila...@gmail.com

Oct 21, 2016, 10:30:41 AM
to Caffe Users
It should be possible to achieve this by introducing per-class weights into the loss function. The weights could be computed from the number of pixels of each class in the ground truth. For source code, look at https://bitbucket.org/deeplab/deeplab-public/src/26b6af817e4676f6409aa0d609096b31ff72917d/src/caffe/layers/softmax_loss_layer.cpp?at=master and follow what happens with the weight_source parameter.
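
If it helps, weights of that kind could be computed with something like the snippet below (numpy sketch; the ignore label of 255 is an assumption, and the on-disk format that weight_source expects is something you would have to check against that source file):

import numpy as np

# labels: iterable of 2-D integer arrays with ground-truth class indices
def class_weights(labels, num_classes, ignore_label=255):
    counts = np.zeros(num_classes, dtype=np.float64)
    for lbl in labels:
        valid = lbl[lbl != ignore_label]
        counts += np.bincount(valid, minlength=num_classes)
    freq = counts / counts.sum()
    weights = 1.0 / np.maximum(freq, 1e-12)       # rarer classes get larger weights
    return weights / weights.sum() * num_classes  # scale so the mean weight is 1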

Regards
Nilas

Alex Ter-Sarkisov

Sep 11, 2017, 2:54:53 PM
to Caffe Users
You should use the InfogainLoss layer. It compensates for class imbalance, which matters especially for pixel-level predictions.
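
A minimal setup would look something like this (a sketch; the matrix file name and class count are placeholders, and whether InfogainLoss applies softmax internally depends on your Caffe version, so check how you feed it the scores):

layer {
  name: "loss"
  type: "InfogainLoss"
  bottom: "score"
  bottom: "label"
  top: "loss"
  infogain_loss_param {
    source: "infogain_H.binaryproto"   # K x K weight matrix H
  }
}

With H a diagonal matrix of per-class weights, written out in Python:

import numpy as np
import caffe

K = 21                                   # number of classes (placeholder)
weights = np.ones(K, dtype=np.float32)   # fill in your per-class weights here
H = np.diag(weights)

blob = caffe.io.array_to_blobproto(H.reshape(1, 1, K, K))
with open('infogain_H.binaryproto', 'wb') as f:
    f.write(blob.SerializeToString())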