How can I manually set the output gradient of an LSTM and then call backward?

nelson

Sep 12, 2018, 10:09:42 AM
to DyNet Users
Dear All

I am implementing a CRF in C++ and want to use an LSTM as the neural part. Hence, I need to take the gradient calculated by the CRF and pass it into the backward pass of the computation graph, but the parameter of the backward function is a loss expression. How can I manually set the output gradient of the LSTM and then call backward?

Thanks a lot!

Graham Neubig

Sep 12, 2018, 10:30:01 AM
to nangu...@gmail.com, DyNet Users
The easiest way is to input the gradient as a constant expression, then
take its dot product with the output of the LSTM:

# assuming `import dynet as dy`; lstm_func is a placeholder from the question
output = lstm_func()   # Expression holding the LSTM output
grad = ...             # numpy array with the gradient computed by the CRF
# backward() needs a scalar, so use a dot product: its derivative w.r.t.
# `output` is exactly `grad`, which is what gets backpropagated.
loss = dy.dot_product(output, dy.inputTensor(grad))
loss.backward()
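
For reference, here is a more complete sketch of the same trick in the
DyNet Python API. The dimensions, the random dummy sequence, and the name
crf_gradient are all illustrative placeholders, not from this thread:

import dynet as dy
import numpy as np

pc = dy.ParameterCollection()
builder = dy.LSTMBuilder(1, 10, 8, pc)    # 1 layer, input dim 10, hidden dim 8
trainer = dy.SimpleSGDTrainer(pc)

dy.renew_cg()
state = builder.initial_state()
for x in [np.random.randn(10) for _ in range(5)]:   # dummy 5-step input sequence
    state = state.add_input(dy.inputTensor(x))
output = state.output()                   # final hidden state, an 8-dim Expression

crf_gradient = np.random.randn(8)         # stand-in for dLoss/d(output) from the CRF

# The dot product with a constant has that constant as its derivative,
# so backward() injects exactly crf_gradient into the LSTM.
loss = dy.dot_product(output, dy.inputTensor(crf_gradient))
loss.backward()
trainer.update()                          # apply the resulting parameter gradients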

You can also just implement the CRF dynamic program in DyNet, which
would reduce the probability of bugs.
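
If you go that route, a minimal sketch of the CRF forward algorithm (the
partition function) in DyNet Python might look like the following;
emissions, trans, and num_tags are hypothetical names:

def forward_logZ(emissions, trans, num_tags):
    # emissions: list over time steps of num_tags-dim score Expressions
    # trans: num_tags x num_tags Expression; entry (i, j) scores tag i -> tag j
    alpha = emissions[0]                  # forward scores at the first position
    for emit in emissions[1:]:
        alphas_t = []
        for j in range(num_tags):
            # next_alpha[j] = logsumexp_i(alpha[i] + trans[i, j]) + emit[j]
            scores = alpha + dy.pick(trans, j, dim=1)
            alphas_t.append(dy.logsumexp([dy.pick(scores, i)
                                          for i in range(num_tags)])
                            + dy.pick(emit, j))
        alpha = dy.concatenate(alphas_t)
    return dy.logsumexp([dy.pick(alpha, i) for i in range(num_tags)])

The negative log-likelihood is then this logZ minus the score of the gold
tag sequence; since the whole loss is a single DyNet expression,
loss.backward() handles the CRF part automatically and no manual gradient
is needed.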

Graham