Hi,
I want to use MultinomialLogisticLoss (cross-entropy) with softmax activations (usually called the softmax loss) for multi-label classification.
I know that intuitively it doesn't make sense to use softmax activations for a multi-label problem, but Facebook reported in their paper "Exploring the Limits of Weakly Supervised Pretraining" that they work better than sigmoid activations.
However, I have found that both the MultinomialLogisticLoss and the SoftmaxWithLoss layers require integers (class-label indices) as targets, while I need real-valued (soft) targets. The only cross-entropy loss layer that accepts real-valued targets is SigmoidCrossEntropyLoss, but that uses sigmoid activations rather than softmax.
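For reference, this is roughly the loss I'm after: a softmax over the scores followed by a cross-entropy against real-valued target distributions. Below is a minimal sketch of it as a custom Python layer — the module and layer names are mine, just to illustrate what I mean, and it assumes each sample's targets sum to 1:

```python
import caffe
import numpy as np


class SoftmaxCrossEntropySoftTargetLayer(caffe.Layer):
    """Sketch of a softmax cross-entropy loss with real-valued (soft) targets.

    bottom[0]: raw scores, shape (N, C, ...)
    bottom[1]: soft target distributions, same shape as bottom[0]
    top[0]:    scalar loss
    """

    def setup(self, bottom, top):
        if len(bottom) != 2:
            raise Exception("Need two bottoms: scores and soft targets.")

    def reshape(self, bottom, top):
        if bottom[0].data.shape != bottom[1].data.shape:
            raise Exception("Scores and targets must have the same shape.")
        top[0].reshape(1)  # scalar loss

    def forward(self, bottom, top):
        scores = bottom[0].data
        # numerically stable softmax along the channel axis
        shifted = scores - scores.max(axis=1, keepdims=True)
        exp = np.exp(shifted)
        self.prob = exp / exp.sum(axis=1, keepdims=True)
        targets = bottom[1].data
        eps = 1e-12
        # cross-entropy against the soft targets, averaged over the batch
        loss = -np.sum(targets * np.log(self.prob + eps))
        top[0].data[...] = loss / scores.shape[0]

    def backward(self, top, propagate_down, bottom):
        if propagate_down[0]:
            # gradient of softmax cross-entropy with soft targets that sum
            # to 1 per sample: prob - target, averaged over the batch
            num = bottom[0].data.shape[0]
            bottom[0].diff[...] = (self.prob - bottom[1].data) / num
```

I could wire this in as a `Python` layer with a `loss_weight`, but that is exactly what I'd like to avoid — I'm hoping there is a built-in way.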
Is there any way to do this with the default Caffe layers?
Thanks