I need cumsum, but Knet autograd doesn't seem to support it


Sinan Cem Yücel

Jan 19, 2023, 4:49:06 PM
to com...@ku.edu.tr

Hi,

 

I need to use a cumsum operation on a 4D tensor. The forward pass works fine, but Knet's AutoGrad gives an error on the backward pass for cumsum. I could code it with a for loop, but I am guessing that would be slow. What should I do?

 

Thanks.

 

Deniz Yuret

Jan 19, 2023, 11:38:56 PM
to Sinan Cem Yücel, com...@ku.edu.tr
You can define the gradient for cumsum (or any new operator) in AutoGrad. Please contact me if you need help after (1) reading the AutoGrad/Knet documentation on the subject and (2) figuring out what the gradient should be mathematically.
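For example, here is a rough, untested sketch of such a definition (the names cumsum_dim and cumsum_grad are made up for illustration and are not part of Knet or AutoGrad; it also assumes cumsum and reverse are available for your array type). Since each output element of cumsum is the sum of the inputs up to that position along the chosen dimension, the gradient with respect to the input is a reversed cumulative sum of the incoming gradient along the same dimension:

using AutoGrad

# Wrapper with a positional dims argument (illustrative name, not part of Knet/AutoGrad).
cumsum_dim(x, dims) = cumsum(x; dims=dims)

# Gradient of cumsum: since y[j] = x[1] + ... + x[j] along dims,
# dL/dx[i] = dy[i] + ... + dy[end], i.e. a reversed cumulative sum of dy.
cumsum_grad(dy, dims) = reverse(cumsum(reverse(dy; dims=dims); dims=dims); dims=dims)

# Register the wrapper as an AutoGrad primitive; the expression after dy,y is
# the gradient with respect to the first argument (x). The integer dims
# argument needs no gradient expression.
@primitive cumsum_dim(x, dims),dy,y  cumsum_grad(dy, dims)

# Sanity check on a small 4D tensor: the gradient of sum(cumsum along dims=2)
# at second-dimension index k should be size(x,2) - k + 1 (here 3, 2, 1).
x = Param(randn(2, 3, 4, 5))
J = @diff sum(cumsum_dim(x, 2))
grad(J, x)

Before relying on it in training, it is worth checking the gradient numerically on a small example; also note that GPU array support for cumsum and reverse may differ from plain Arrays.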

best,
deniz

