How can I manually set the output gradient of LSTM and then call backward.
nelson
Sep 12, 2018, 10:09:42 AM
to DyNet Users
Dear All,
I am implementing a CRF in C++ and want to use an LSTM as the neural component. I therefore need to take the gradient computed by the CRF and feed it into the backward pass of the computation graph, but the parameter of the backward function is the loss expression. How can I manually set the output gradient of the LSTM and then call backward?
Thanks a lot!
Graham Neubig
Sep 12, 2018, 10:30:01 AM
to nangu...@gmail.com, DyNet Users
The easiest way is to input the gradient as a constant expression, take its
dot product with the output of the LSTM, and call backward on the resulting
scalar (an elementwise product would not be a scalar, so backward could not
be called on it):
output = lstm_func()
grad = ...  # numpy array holding dLoss/dOutput, computed by the CRF
loss = dy.dot_product(output, dy.inputTensor(grad))
loss.backward()
You can also just implement the CRF dynamic program in DyNet, which
would reduce the probability of bugs.
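The dot-product trick works because backpropagating from the scalar dot(grad, output) delivers exactly `grad` as the gradient at the LSTM's output, so the rest of the graph receives the vector-Jacobian product it would get from a real loss. A minimal sketch of this identity in plain NumPy (hypothetical names, and a linear map standing in for the LSTM rather than DyNet itself):

```python
import numpy as np

# Stand-in "LSTM": a linear map y = W x, whose Jacobian dy/dx is simply W.
W = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([0.5, -1.0])
y = W @ x                      # forward pass

# Externally computed output gradient dLoss/dy (e.g. from a CRF layer).
g = np.array([2.0, -3.0])

# Surrogate scalar loss: dot(grad, output). Backpropagating it through
# the linear map gives dloss/dx = W^T g ...
dloss_dx = W.T @ g

# ... which is exactly the vector-Jacobian product g^T (dy/dx) that the
# CRF needs to push back into the network.
assert np.allclose(dloss_dx, g @ W)
print(dloss_dx)                # → [-7. -8.]
```

The same reasoning holds for a real LSTM: DyNet's autodiff computes the vector-Jacobian product for you once the surrogate loss is a scalar.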