Custom activation/non-linearity functions

Ross Andrew Donnachie

May 22, 2020, 4:37:48 AM
to knet-users
Good day, 

I have run into issues (rather nonsensical ones at first, then more illuminating ones) when trying to supply a custom activation function.

A search through the source code leads me to actf.jl, and the related documentation page hints at how gradients are provided. I see that in actf.jl, reluforw is essentially the relu function, while reluback is its documented gradient. How do these definitions get linked to the `relu` symbol, and how would I imitate this to create a 'smooth' function that correctly uses my own smoothforw/smoothback definitions?

I am hoping to get a head start on figuring out how to provide a custom activation function. If I have to forgo @gpu (since I see there are custom mode flags for each activation function), that is fine. Are the GPU activation functions actually defined in the GPU/CUDNN Julia packages?

Thanks in advance for any time/consideration!

Ross

Deniz Yuret

May 22, 2020, 11:52:28 PM
to Ross Andrew Donnachie, knet-users
Hi,

actf.jl has been deprecated; are you sure you are using the latest Knet version?

The current definitions should be in src/unary.jl.

To define a new primitive please see `@doc AutoGrad.@primitive`.
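
For example, a minimal sketch (the `smooth` name, the softplus-style forward definition, and the gradient expression below are just illustrative assumptions, not taken from the Knet source) of registering a custom activation with AutoGrad:

```julia
using AutoGrad

# Hypothetical "smooth" activation (a softplus); works elementwise on arrays.
smooth(x) = log.(1 .+ exp.(x))

# Register its gradient: `dy` is the incoming gradient, `y` the saved forward
# output, and d/dx log(1+exp(x)) = 1/(1+exp(-x)) = 1 - exp(-y).
@primitive smooth(x),dy,y  (dy .* (1 .- exp.(-y)))
```

Once registered, calling `smooth` on a tracked value (e.g. under `@diff` or `grad`) should use the declared gradient instead of trying to differentiate through the forward code.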

best,
deniz

