Hi!
I am currently working on a problem with 1000 binary classifications (multi-label classification). I am using the sigmoid cross-entropy loss, but the ratio of positive to negative samples is very unbalanced - there are far fewer positives. So I want to apply a weighted sigmoid cross-entropy loss.
According to the formula, I programmed the loss function in Caffe, but it does not converge, and I don't know why.
Do you have any code for a weighted sigmoid cross-entropy loss? I would be grateful if you could share it.
Below is my code; it is rough and simple.
    loss[i] = -wp[i % 1000] * target[i] *
                  ((1 - (input_data[i] >= 0)) * input_data[i]
                   - log(1 + exp(input_data[i] - 2 * input_data[i] * (input_data[i] >= 0))))
              + (1 - wp[i % 1000]) * (1 - target[i]) *
                  (input_data[i] * (input_data[i] >= 0)
                   + log(1 + exp(input_data[i] - 2 * input_data[i] * (input_data[i] >= 0))));
Here wp is the vector of per-class positive weights I want to add; it is 1000 elements long (one weight per label), and the negative term is weighted by 1 - wp.