Joint gradient and function evaluation


Paul vdb

Feb 27, 2026, 7:16:09 PM
to TMB Users
I'm not sure I understand ADjoint well enough to do the following.

I'm interested in extending expAv in RTMB to compute the gradient jointly with the function call. I think this could improve speed by avoiding the need to retape during optimization whenever the number of series iterations required to reach a given accuracy changes.

See Algorithm 2 of "Differentiated uniformization: a new method for inferring Markov chains on combinatorial state spaces including stochastic epidemic models".

The core piece is that I need to compute the gradient and the function value in a single pass, not in two separate functions, and then return them jointly. Is this something I can do with ADjoint?
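To make the kind of joint computation concrete, here is a rough pure-Python sketch (not RTMB; the 2-state rate matrix Q(theta) and the fixed uniformization rate are illustrative assumptions, not taken from the paper) of what I believe Algorithm 2 does: propagate the value recursion and its theta-derivative through the same truncated uniformization loop.

```python
import math

def matvec(M, v):
    """Dense matrix-vector product on plain lists."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def joint_expQt_v(theta, t, v, K=60):
    """Return (exp(Q(theta) * t) @ v, its d/dtheta) in one pass, via a
    truncated uniformization series differentiated term by term."""
    # Illustrative 2-state generator (an assumption for this sketch):
    # Q = [[-theta, theta], [1, -1]], so dQ/dtheta = [[-1, 1], [0, 0]].
    Q  = [[-theta, theta], [1.0, -1.0]]
    dQ = [[-1.0, 1.0], [0.0, 0.0]]
    lam = 5.0  # fixed uniformization rate; valid while max |Q_ii| <= lam
    n = len(v)
    P  = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(n)]
          for i in range(n)]
    dP = [[dQ[i][j] / lam for j in range(n)] for i in range(n)]
    w, dw = list(v), [0.0] * n        # w_k = P^k v and its theta-derivative
    val, grad = [0.0] * n, [0.0] * n
    weight = math.exp(-lam * t)       # Poisson(lam * t) pmf at k = 0
    for k in range(K + 1):
        for i in range(n):
            val[i]  += weight * w[i]
            grad[i] += weight * dw[i]
        # product rule: d(P^(k+1) v) = dP (P^k v) + P d(P^k v)
        dw = [a + b for a, b in zip(matvec(dP, w), matvec(P, dw))]
        w  = matvec(P, w)
        weight *= lam * t / (k + 1)   # advance the Poisson weight
    return val, grad
```

The point is that the value and derivative share one loop, so the truncation level used for the function automatically matches the gradient; that is the coupling I'd like to express on the tape rather than in two separately taped functions.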
Thanks!
