Hello
I just recently started out with Caffe, trained my first own networks, and now I want to play around with it more deeply.
I was just wondering what the easiest C++ equivalent is to the Python way of getting the backward gradients of the whole network. I've seen a lot of Python examples on the net using the following code:
# Do backpropagation to calculate the gradient for that outcome
gradient = net.backward(prob=probs)
where gradient, I guess, is a Blob (or, judging by pycaffe.py, a dict of numpy arrays holding the diffs of the input blobs?).
I am not a big Python coder and I prefer C++, which I am far more familiar with, so I checked the C++ API documentation.
But I realized that the Net::Backward method is a void function, so it doesn't return a Blob<Dtype>.
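From reading include/caffe/blob.hpp, my current understanding is that nothing is returned because the gradients are stored in place: every Blob carries two arrays, data and diff, and Backward() fills in the diff arrays. A minimal sketch of what I mean (untested, with placeholder file names):

#include <caffe/caffe.hpp>

int main() {
  // Placeholder model files; phase TEST for a deploy net.
  caffe::Net<float> net("deploy.prototxt", caffe::TEST);
  net.CopyTrainedLayersFrom("model.caffemodel");

  net.Forward();
  net.Backward();  // void: writes gradients into the blobs' diff arrays

  // Every Blob holds data (activations) and diff (gradients).
  const caffe::Blob<float>* out = net.output_blobs()[0];  // assuming one output
  const float* activations = out->cpu_data();  // filled by Forward()
  const float* gradients   = out->cpu_diff();  // filled by Backward()
  return 0;
}

(Though I suppose the diffs stay all zero unless the top diff is seeded first, see below.)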
I guess I can probably set the probs, as in the Python code above, by modifying the Net::output_blobs(), either via Blob<Dtype>::set_cpu_data or, since it is really a gradient I want to seed, by writing to Blob<Dtype>::mutable_cpu_diff, and then call Net::Backward. (Do I need to call Blob<Dtype>::Update afterwards?)
But how do I then get the same gradients that the Python call returns?
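To make it concrete, here is what I currently have in mind, assuming a single input blob named "data", a single output blob (e.g. "prob"), and force_backward: true in the deploy prototxt (without which, if I read the pycaffe examples right, Caffe may decide it doesn't need to backpropagate at all):

#include <vector>
#include <caffe/caffe.hpp>

// Hypothetical helper, mirroring what I think net.backward(prob=probs) does:
// seed the output diff, backpropagate, return the gradient w.r.t. the input.
std::vector<float> InputGradient(caffe::Net<float>& net,
                                 const std::vector<float>& grad_wrt_output) {
  caffe::Blob<float>* prob = net.output_blobs()[0];  // assuming one output
  float* prob_diff = prob->mutable_cpu_diff();
  for (int i = 0; i < prob->count(); ++i) {
    prob_diff[i] = grad_wrt_output[i];  // the prob=probs part of the Python call
  }

  net.Backward();  // fills the diff arrays along the whole network

  const caffe::Blob<float>* data = net.input_blobs()[0];  // assuming one input
  const float* data_diff = data->cpu_diff();
  return std::vector<float>(data_diff, data_diff + data->count());
}

And if blob.cpp is right that Blob<Dtype>::Update() just computes data = data - diff, then I'd guess it is not needed here, since I only want to read the gradient, not change any values.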
I checked the source code of the Python wrapper, but that didn't lead to much either: as far as I can tell it just copies the arguments into the blobs' diffs and calls the C++ function (the same goes for the MATLAB/MEX wrapper), so I wonder how this actually works at all.
Do I have to call Net::bottom_vecs()?
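Or, instead of going through Net::bottom_vecs(), maybe it is enough to look the blob up by its name from the prototxt? Again just a guess:

#include <iostream>
#include <boost/shared_ptr.hpp>
#include <caffe/caffe.hpp>

void PrintInputGradient(caffe::Net<float>& net) {
  // "data" is assumed to be the input blob's name in the prototxt.
  const boost::shared_ptr<caffe::Blob<float> > data_blob =
      net.blob_by_name("data");
  if (data_blob) {
    const float* grad = data_blob->cpu_diff();
    std::cout << "d(out)/d(data)[0] = " << grad[0] << std::endl;
  }
}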
Any hints would be greatly appreciated. Thank you!