to DyNet Users
Hi,
I am a new user of DyNet, so please excuse me if I write something obvious to the others.
I have a couple of questions about a training setup that I am trying to get working.
I am creating a 2-class classifier with an MLP feedforward net based on the MNIST example.
- Given that I have unbalanced input data, I would like to know if there is a method to weight the classes when doing backpropagation (similar to the class_weight parameter you can use inside Keras when calling model.fit).
Also, I am using pickneglogsoftmax (post-processed via sum_batches) to estimate the loss, while in Keras I was using the categorical_crossentropy loss function. Is it possible to use something more similar to categorical_crossentropy in DyNet?
Thank you very much in advance!
Emanuele.
Graham Neubig
Aug 30, 2018, 10:44:19 PM
to emanuele...@gmail.com, DyNet Users
(Trying one more time because my first mail bounced)
There isn't an explicit method for this, but one way to do it would be something like the following:
import dynet as dy

class_weights = ...  # create a numpy array with one weight per batch member
class_weights = dy.inputTensor(class_weights, batched=True)
log_loss = dy.pickneglogsoftmax_batch(scores, labels)
# scale each example's loss by the weight of its class
weighted_log_loss = class_weights * log_loss
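For completeness, here is a minimal sketch of how that weighted loss could feed into a full training step, aggregating over the minibatch with sum_batches as mentioned in the original question. The parameter collection, the trainer choice, and the helper compute_scores (a batched MLP forward pass) are assumptions for illustration, not part of the reply above.

import dynet as dy
import numpy as np

model = dy.ParameterCollection()
trainer = dy.SimpleSGDTrainer(model)

def train_step(batch_inputs, batch_labels, batch_weights):
    # batch_weights: one float per example, e.g. larger for the rarer class
    dy.renew_cg()
    scores = compute_scores(batch_inputs)  # hypothetical batched MLP forward pass
    class_weights = dy.inputTensor(np.asarray(batch_weights, dtype=float), batched=True)
    log_loss = dy.pickneglogsoftmax_batch(scores, batch_labels)
    weighted_log_loss = class_weights * log_loss  # per-example weighting
    loss = dy.sum_batches(weighted_log_loss)      # aggregate over the minibatch
    loss.backward()
    trainer.update()
    return loss.value()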