Hi,
I am trying to perform multi-class classification. Ideally I would use a cross-entropy loss to train my neural network. However, my classes are ordinal variables, so I would like my loss function to enforce some sort of order in the predictions: for example, if y_true = 2, I would prefer y_pred = 3 over y_pred = 4. For this, I am thinking of using a custom loss function that combines cross-entropy loss and mean absolute error after a softmax layer:
from keras import backend as K
from keras import losses

# Weights for the cross-entropy and ordinal (MAE) terms
loss_weight = [1, 0.0001]
loss_weight_tensor = K.variable(value=loss_weight)

def custom_loss(y_true, y_pred):
    l1 = K.sparse_categorical_crossentropy(y_true, y_pred)
    # y_pred_argmax: the predicted class index from the softmax output
    y_pred_argmax = K.cast(K.argmax(y_pred, axis=1), dtype='float32')
    l2 = losses.mean_absolute_error(y_pred_argmax, K.cast(y_true, 'float32'))
    return l1 * loss_weight_tensor[0] + l2 * loss_weight_tensor[1]
Is there a fallacy in my thinking or in the construction of this loss function? Does it look like a valid loss function (piecewise differentiable, etc.) given that I am using argmax? And do you think the TensorFlow backend will compute a valid gradient? Or is there a better alternative for achieving ordinal classification?
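One alternative I have been considering, since argmax has zero gradient almost everywhere: replace the hard argmax with the expected class index under the softmax distribution, E[c] = sum_c c * p_c, which is differentiable in the probabilities. Here is a rough NumPy sketch of just the ordinal penalty term (names like soft_ordinal_penalty are mine, and this is untested in Keras):

import numpy as np

def soft_ordinal_penalty(y_true, probs):
    # Differentiable stand-in for |argmax(p) - y_true|:
    # the expected class index E[c] = sum_c c * p_c
    classes = np.arange(probs.shape[-1], dtype=float)
    expected = (probs * classes).sum(axis=-1)
    return np.abs(expected - y_true)

# Two softmax outputs for y_true = 2: mass concentrated near class 3
# versus mass concentrated near class 4
p_near = np.array([[0.05, 0.15, 0.30, 0.45, 0.05]])
p_far  = np.array([[0.05, 0.10, 0.20, 0.15, 0.50]])
print(soft_ordinal_penalty(np.array([2.0]), p_near))  # 0.30
print(soft_ordinal_penalty(np.array([2.0]), p_far))   # 0.95

The far prediction is penalized more than the near one, which is the ordering behaviour I want, and the penalty flows gradients into every class probability rather than none.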
Thanks,
Adit
--
You received this message because you are subscribed to the Google Groups "Keras-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to keras-users+unsubscribe@googlegroups.com.
To view this discussion on the web, visit https://groups.google.com/d/msgid/keras-users/CAA4bTr8iJqwEN4BAfLKxogukg_OaYWnKOBn3UFbgwB2%3D6GdKcw%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.