optimizers is deprecated, use sgd, adam etc. instead.

James Clements

Apr 27, 2023, 10:26:03 AM
to knet-users
I'm running the code from https://github.com/ekinakyurek/GAN-70-Lines-of-Julia and getting the warning in the subject line.

What is the new way to specify an Adam optimizer?

The offending line is:

    𝚶 = (g=optimizers(𝗪.g,Adam;lr=0.0002),d=optimizers(𝗪.d,Adam;lr=0.0002))

feeding

    runmodel(𝗪,dtst,𝞗; optim=𝚶, train=false)

which in turn feeds it to these lines:

    train ? update!(𝗪.d, ∇d(𝗪.d,x,Gz), optim.d) : (dloss += 2B*𝑱d(𝗪.d,x,Gz)) # Discriminator: update or accumulate loss
    z = 𝒩(𝞗[:ginp],2B) # Sample z from noise
    train ? update!(𝗪[1], ∇g(𝗪.g,𝗪.d,z), optim.g) : (gloss += 2B*𝑱g(𝗪.g,𝗪.d,z)) # Generator: update or accumulate loss
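
My best guess at a direct replacement, assuming Knet's Adam constructor (Adam(lr=...)) and that update! accepts matching collections of weights, gradients, and optimizers. The comprehension below is my own untested reconstruction of the structure optimizers() used to return, one optimizer state per weight array:

    # Untested sketch: one Adam state per weight array in each half of the model
    𝚶 = (g=[Adam(lr=0.0002) for _ in 𝗪.g],
         d=[Adam(lr=0.0002) for _ in 𝗪.d])

The deprecation message itself seems to point at the newer sgd/adam training-function interface, but switching to that would mean restructuring runmodel's manual update! calls rather than just swapping this one line.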

JC