How goes keras calculate gradients for custom loss functions?
Nick Frosst
Nov 15, 2015, 1:20:12 PM11/15/15
to Keras-users
So I see that Keras can use custom loss functions by simply passing a function to model.compile(loss=your_loss, optimizer=sgd), but how does it optimize this? You don't give it a derivative of the function, so how does it get the gradient? Thanks :)
cheers _nick
Klemen Grm
Nov 16, 2015, 2:49:55 AM11/16/15
to Keras-users
If you look at the source file for the built-in objective functions ( https://github.com/fchollet/keras/blob/master/keras/objectives.py ), you'll notice they're all implemented as Theano tensor expressions, which enables automatic gradient calculation: Theano differentiates the expression graph symbolically, so no hand-written derivative is needed. The same must be true of any custom objective function you implement yourself.
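To illustrate why writing the loss out of differentiable operations is enough, here's a small self-contained numpy sketch (not actual Keras/Theano code; the names custom_mse and custom_mse_grad are hypothetical). A real custom objective of that era would use theano.tensor ops, and Theano would derive the gradient below for you symbolically; here we write the gradient by hand and verify it against finite differences:

```python
import numpy as np

# A hypothetical custom loss, written purely in terms of elementwise
# differentiable operations (numpy stands in for theano.tensor here).
def custom_mse(y_true, y_pred):
    return np.mean((y_pred - y_true) ** 2)

# The gradient of the loss w.r.t. y_pred. With a Theano expression,
# automatic differentiation would produce this for you.
def custom_mse_grad(y_true, y_pred):
    return 2.0 * (y_pred - y_true) / y_pred.size

# Finite-difference check that the analytic gradient matches the loss.
rng = np.random.default_rng(0)
y_true = rng.standard_normal(5)
y_pred = rng.standard_normal(5)
eps = 1e-6
numeric = np.array([
    (custom_mse(y_true, y_pred + eps * e)
     - custom_mse(y_true, y_pred - eps * e)) / (2 * eps)
    for e in np.eye(5)
])
assert np.allclose(numeric, custom_mse_grad(y_true, y_pred), atol=1e-5)
```

The practical upshot is that a custom objective must stick to the framework's tensor operations end to end; dropping out to plain Python/numpy in the middle would break the symbolic graph and with it the automatic gradients.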