InfogainLossLayer doesn't work


Henry

Jul 27, 2017, 6:02:13 AM7/27/17
to Caffe Users
Hi all

I have very imbalanced training data and plan to use the InfogainLoss layer to deal with it. The code is shown below:

n.softmax_score = L.Softmax(n.score)

n.loss = L.InfogainLoss(n.softmax_score, n.label,
            loss_param=dict(normalization=True, ignore_label=0),
            infogain_loss_param=dict(source="infogainH.binaryproto"))

The values on the diagonal of the infogainH.binaryproto matrix are 1, 1, 1, 2, 2, 4, 1, 1, ............. and all off-diagonal values are 0. But during training the loss doesn't decrease. When I use a SoftmaxWithLoss layer instead, the loss decreases very quickly.
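
In case it matters, a binaryproto H matrix like this can be written roughly as below (a minimal sketch using caffe.io.array_to_blobproto; the class count of 8 and the exact diagonal weights are placeholders, not my real values):

import numpy as np
import caffe

num_classes = 8  # placeholder: set to the actual number of classes
H = np.eye(num_classes, dtype=np.float32)
# up-weight the rarer classes on the diagonal (placeholder weights)
H[3, 3] = 2.0
H[4, 4] = 2.0
H[5, 5] = 4.0

# InfogainLoss reads H from a BlobProto; store it as a 1 x 1 x K x K blob
blob = caffe.io.array_to_blobproto(H.reshape(1, 1, num_classes, num_classes))
with open('infogainH.binaryproto', 'wb') as f:
    f.write(blob.SerializeToString())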

I also changed the infogainH.binaryproto matrix to an identity matrix, but the loss still doesn't decrease. The official documentation of the InfogainLoss layer states that "If H = I, this layer is equivalent to the MultinomialLogisticLossLayer."
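
My understanding of the loss this layer computes, and why H = I should reduce it to the multinomial logistic loss, is roughly the following (a small numpy sketch for illustration, not the actual Caffe code):

import numpy as np

def infogain_loss(probs, labels, H):
    # probs: N x K softmax probabilities, labels: N integer labels, H: K x K
    # loss = -1/N * sum_n sum_k H[label_n, k] * log(probs[n, k])
    eps = 1e-12
    return -np.mean(np.sum(H[labels] * np.log(probs + eps), axis=1))

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
labels = np.array([0, 1])
I = np.eye(3)

# with H = I only the true-label term survives, i.e. -mean(log p[label])
print(infogain_loss(probs, labels, I))
print(-np.mean(np.log(probs[np.arange(2), labels])))  # same value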

Any idea where my mistake is? Many thanks for your help!

Best
Henry




Henry

Aug 7, 2017, 10:56:05 AM8/7/17
to Caffe Users
bump!