weighted sigmoid cross entropy loss for caffe


mini...@gmail.com

Nov 29, 2016, 6:39:23 AM
to Caffe Users
Hi!

I am currently working on a problem with 1000 binary classifications (a multi-label classification task). I am using sigmoid cross entropy loss, but the ratio of positive to negative samples is highly imbalanced: the positives are far fewer. So I want to apply a weighted sigmoid cross entropy loss.

Following the formula, I implemented the loss function in Caffe, but it does not converge, and I don't know why.

Do you have any code for a weighted sigmoid cross entropy loss?

I would be grateful if you could share it with me.

Below is my code. It is rough and simple. 

// first term:  wp * target * -log(sigmoid(x))          (numerically stable form)
// second term: (1 - wp) * (1 - target) * -log(1 - sigmoid(x))
loss[i] = wp[i % 1000] * target[i] * (-(1 - (input_data[i] >= 0)) * input_data[i] + log(1 + exp(input_data[i] - 2 * input_data[i] * (input_data[i] >= 0))))
        + (1 - wp[i % 1000]) * (1 - target[i]) * (input_data[i] * (input_data[i] >= 0) + log(1 + exp(input_data[i] - 2 * input_data[i] * (input_data[i] >= 0))));

wp is the weight vector I want to add; it has length 1000, one weight per class.

soubarna banik

Nov 14, 2017, 7:32:15 AM
to Caffe Users
Hi, I am stuck with the same problem. Did your formula work? The loss formula Caffe uses is a derived version, not exactly the textbook formula for cross entropy loss, so I am a bit skeptical about just multiplying it by a weight.
Your reply would be helpful.

xiaohu Bill

Jan 23, 2018, 11:30:36 PM
to Caffe Users
Have you solved it, buddy?