How are momentum variables handled if one uses multiple Update functions?

arturs...@googlemail.com

unread,
Apr 4, 2018, 9:01:32 AM4/4/18
to lasagne-users
Hey,

Let's assume I have two different loss functions (for example, one supervised and one unsupervised) that I use to train the same set of parameters.

To do this I defined two update functions, both using Adam, which I call alternately.
I am wondering how the momentum variables of the optimizer are handled in such a case.
Are they shared, or does each function maintain a separate set of momentum variables?
I suspect they are separate; if so, how could I change this behaviour so that they are shared?

In other words, I want the unsupervised update to build on the moment estimates left by the previously executed supervised update when Adam computes its next step.
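To make the question concrete, here is a minimal sketch of the desired behaviour in plain NumPy rather than Theano/Lasagne (the `Adam` class, `grad_sup`, and `grad_unsup` below are all illustrative inventions, not Lasagne API). The key idea is that the moment estimates `m`, `v` and the step counter `t` live in one optimizer object, so two alternating losses advance the same state; each separate call to `lasagne.updates.adam()` would instead create its own fresh shared variables:

```python
import numpy as np

class Adam:
    """Toy Adam optimizer whose moment estimates (m, v, t) live in one
    place, so any number of losses can share them.  A sketch of the idea
    only, not Lasagne's implementation."""
    def __init__(self, params, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        self.params = params
        self.m = [np.zeros_like(p) for p in params]  # first moment estimates
        self.v = [np.zeros_like(p) for p in params]  # second moment estimates
        self.t = 0                                   # shared step counter
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps

    def step(self, grads):
        self.t += 1
        # bias-corrected step size, as in the Adam update rule
        a = self.lr * np.sqrt(1 - self.beta2**self.t) / (1 - self.beta1**self.t)
        for p, g, m, v in zip(self.params, grads, self.m, self.v):
            m[:] = self.beta1 * m + (1 - self.beta1) * g
            v[:] = self.beta2 * v + (1 - self.beta2) * g**2
            p[:] -= a * m / (np.sqrt(v) + self.eps)

# One optimizer instance -> one set of moment variables, even though two
# different (toy) losses alternate.
params = [np.array([1.0, -2.0])]
opt = Adam(params)

grad_sup = lambda p: 2 * p[0]          # gradient of a toy "supervised" loss
grad_unsup = lambda p: np.sign(p[0])   # gradient of a toy "unsupervised" loss

for i in range(10):
    g = grad_sup(params) if i % 2 == 0 else grad_unsup(params)
    opt.step([g])

print(opt.t)  # both losses advanced the same moment estimates
```

In Lasagne itself, the analogous route would presumably be to write the Adam updates by hand (following the source of `lasagne.updates.adam`), creating the moment shared variables once and reusing them when building both update dictionaries, so that both compiled Theano functions read and write the same state.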