Is there a way to simply call a backpropagation on a model with gradients as input?


jussm...@gmail.com

Jun 1, 2016, 4:12:11 PM
to Keras-users
Hi, new to Keras here. I'm moving from a custom neural-net API to (hopefully) Keras so that I can cut my prototyping time and my workload in general. For my specific use case, though, I'm having some trouble with the transition.

This is possibly a weird question, since what I'm doing is non-standard.

What I need is (in pseudocode):

Assume I have defined two models: model1 and model2

Key (see the comments):
"Mine" / "Keras": I can do this
"I need": this is what I'm missing

-------------------PSEUDO-CODE BELOW--------------------
begin batch loop:
   input1, input2, target = get_input_batch_data() # Mine - this returns two different inputs for two disjoint neural networks and a set of target data

   output1 = model1.feed_forward_only(input1) # Keras ... This isn't the problem because I can use model.predict_on_batch() for this.
   output2 = model2.feed_forward_only(input2) # Keras again

   predicted = custom_forward_mapping_function(output1, output2) # Mine - don't worry about what this does - just know that my actual loss is based on two different outputs from two distinct networks

   total_grads = loss(predicted, target) # Mine - this isn't a problem as I can do this myself easily

   grads1, grads2 = custom_backward_mapping_function(total_grads) # Mine - again, don't worry what this does


   model1.backprop_and_update(grads1) # I need a function that does this
   model2.backprop_and_update(grads2) # ^ ditto


end batch loop
-------------------------------------------------------------------------

So, the point is, I need a function that lets me backpropagate through a model given gradients at its output. I couldn't find anything in the documentation about this. All the training functions (single- and multi-batch) seem to be a single call that does the forward pass, the loss, and the backprop/update all at once. This won't work for me.

Thanks!
Justin

mridul birla

Feb 24, 2017, 3:55:46 AM
to Keras-users, jussm...@gmail.com
Hey,

Did you find a solution?

bir...@gmail.com

Feb 13, 2018, 4:53:24 AM
to Keras-users
Yes, this is desperately needed.

Daπid

Feb 13, 2018, 11:17:08 AM
to bir...@gmail.com, Keras-users
As far as I understand the problem, that is what the optimiser exposes through its get_updates method:

https://github.com/keras-team/keras/blob/596cca7d5aa356d9315eb4458b0adc02800b8632/keras/optimizers.py#L167-L193
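Concretely, something along these lines should do it. This is a minimal, untested sketch assuming Keras 2 on the TensorFlow backend; model1, input1 and grads1 are the names from your pseudocode, and the exact get_updates signature has changed between Keras versions, so check yours. The trick is a surrogate loss K.sum(output * external_grads): since the external gradients are constants with respect to the weights, differentiating that sum just backpropagates them through the network.

import keras.backend as K
from keras.optimizers import SGD

# Placeholder for the externally computed gradient of the loss with
# respect to the model's output (same shape as the output itself).
output_grads = K.placeholder(shape=model1.output_shape)

# Surrogate loss: output_grads is a constant w.r.t. the weights, so
# d(sum(output * output_grads))/dW is exactly output_grads
# backpropagated through the network.
surrogate_loss = K.sum(model1.output * output_grads)

opt = SGD(lr=0.01)
updates = opt.get_updates(loss=surrogate_loss,
                          params=model1.trainable_weights)

# One call performs one optimizer step driven by the external gradients.
backprop_and_update1 = K.function([model1.input, output_grads],
                                  [],
                                  updates=updates)

# Inside the batch loop:
#   backprop_and_update1([input1, grads1])

If model1 contains layers that behave differently during training (Dropout, BatchNormalization), you would also need to add K.learning_phase() to the function's inputs and feed 1 when calling it. Build a second function the same way for model2.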


/David.

