'label' blob is changed by calls to net.forward()


Tiferet Gazit

May 11, 2016, 3:54:22 AM
to Caffe Users
I built a network (the problem doesn't depend on the network architecture) with a standard HDF5Data input layer that creates the 'data' and 'label' blobs. After I call net.forward(), the contents of the 'label' blob have changed, as seen by inspecting net.blobs['label'].data. This happens even though the only layers touching 'label' are the standard HDF5Data input layer and the standard SoftmaxWithLoss loss layer (which is only supposed to read from it, not write to it).

Why is this happening? Also, how can I get around it to correctly read my labels from the network after it has run? I want to visualize predictions vs. ground truth labels, by doing something like this:

import caffe

net = caffe.Net(test_net_filename, trained_network_model_filename, caffe.TEST)
net.forward()  # run one batch through the network
last_layer = 'score'
slice_net = net.blobs['data'].data[0]               # first image in the batch
predictions = net.blobs[last_layer].data.argmax(1)  # predicted class per location
labels_net = net.blobs['label'].data[0]             # ground-truth labels for the same image

Jan

May 11, 2016, 5:15:24 AM
to Caffe Users
Maybe you don't correctly understand the net.forward() command, because this is completely expected behavior. Every call to net.forward() causes exactly _one_ call to forward() on every layer in your network, in sequence; that is, usually exactly _one_ batch of data is fed through the network, and the values in the top/bottom blobs are updated accordingly. Calling forward on a data layer (any kind of data layer) causes that layer to read a batch from its data source and put it in its top blobs, so of course the data in there changes. But generally your code snippet should work; I don't really see the problem. Maybe you can clarify.
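To make that concrete, here is a pure-NumPy sketch (no Caffe required; names like data_layer_forward and label_top are made up for illustration) of what happens on each net.forward() call: the data layer writes the NEXT batch into the same top-blob buffer every time, so anything still pointing at that buffer sees the new values.

```python
import numpy as np

dataset_labels = np.array([0., 1., 2., 3., 4., 5.])  # stand-in for the HDF5 label column
batch_size = 2
cursor = 0
label_top = np.zeros(batch_size)  # stand-in for the data layer's top[1], i.e. net.blobs['label'].data

def data_layer_forward():
    """Copy the next batch of labels into the top blob, in place."""
    global cursor
    label_top[...] = dataset_labels[cursor:cursor + batch_size]
    cursor += batch_size

data_layer_forward()       # first net.forward(): label blob holds [0., 1.]
first = label_top.copy()   # snapshot of the first batch
data_layer_forward()       # second net.forward(): the SAME buffer now holds [2., 3.]
```

So between two forward() calls the label blob legitimately changes; the question is whether it changes within a single call.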

Jan

Tiferet Gazit

May 11, 2016, 5:36:55 AM
to Caffe Users
Thank you for your response! However, what I am seeing is that the 'label' blob ends up containing values different from the labels read in from disk, and different from the values the input layer writes into top[1].data[...] in its forward function. So the 'label' blob is being changed at some point after the input layer's forward fills it. In fact, during the loss layer's reshape function the values are still correct, and my loss is computed correctly, but when I read net.blobs['label'].data[0] after the forward() call completes, I see altered values. For example, the labels might be integers in the range [0,6] throughout the code, but when I call net.blobs['label'].data[0] I get back floats smaller than 1.
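One defensive workaround while debugging (a NumPy sketch, not a fix for the underlying cause): snapshot the label blob with .copy() at the moment the values are known to be correct, so later in-place writes to the blob's buffer cannot alter what you want to visualize. Holding a bare reference to blob data keeps a live view of the buffer.

```python
import numpy as np

blob = np.array([0., 3., 6., 2.])  # stand-in for net.blobs['label'].data

labels_view = blob         # live view: follows any later change to the blob
labels_copy = blob.copy()  # independent snapshot: frozen at this moment

blob[...] = 0.5            # something later overwrites the buffer in place
# labels_view now shows 0.5 everywhere; labels_copy still holds [0., 3., 6., 2.]
```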

Jan

May 12, 2016, 3:41:07 AM
to Caffe Users
Well, that shouldn't happen, but I have no idea what is going wrong there. Maybe you can post the network.prototxt that you use with pycaffe?

Jan

Tiferet Gazit

May 16, 2016, 7:41:49 AM
to Caffe Users
I've noticed this happening with all different kinds of networks I've used, so I don't think it's related to the network architecture. Attached is one example, but it's happened before even with far simpler networks.

Thank you for your help!
network.prototxt