Hi everyone,
My question concerns backward computation with the Python wrapper.
When I run a backward pass and look at the derivatives, they are all zero.
What I've done so far:
- Load the network: net = caffe.Classifier(...) (the ImageNet reference model)
- Perform forward computation on an empty image: net.forward() (by default the input is taken from the (empty) data layer). This gives me some probabilities as output.
- Perform backward computation: net.backward()
- Look at the derivatives with net.blobs['blob_name'].diff. Unfortunately, every single diff is empty (all zeros).
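For reference, the steps above look roughly like this in code (a sketch, not my exact script; the model filenames are placeholders for my actual paths):

```python
import numpy as np
import caffe

# Placeholder paths -- substitute the deploy prototxt and weights
# of the ImageNet reference model from your own install.
net = caffe.Classifier('deploy.prototxt', 'caffenet.caffemodel')

out = net.forward()        # forward pass; input comes from the (empty) data blob
print(out['prob'].flatten().argmax())  # some probabilities do come out

net.backward()             # backward pass

# Inspect the diffs -- in my runs every one of these sums is 0.0
for name, blob in net.blobs.items():
    print(name, np.abs(blob.diff).sum())
```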
What I've tried to solve the problem:
- Set force_backward: true in the deploy prototxt
- Tried to give the network a label during forward computation by adding input: "label" to the prototxt - though how would the loss be computed when no label is present?
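Concretely, the top of my deploy prototxt after both changes looks roughly like this (the data input shapes are the ones I believe the reference model's deploy file uses; the label shape is my own guess):

```
force_backward: true
input: "data"
input_dim: 10
input_dim: 3
input_dim: 227
input_dim: 227
input: "label"
input_dim: 10
input_dim: 1
input_dim: 1
input_dim: 1
```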
Thanks in advance!
Rester Hall