Unable to do forward pass with updated internal layer data in pycaffe

Yi Liu

Jul 9, 2015, 2:52:27 PM
to caffe...@googlegroups.com
I am trying to run a conv-net forward pass after manually setting the data blob of conv5 in pycaffe.

In the demo below, I loaded VGG16 and tried to update the pycaffe Net object's internal blobs with random values before running a forward pass. When the conv5 flag is set to True, repeated execution does not yield different output when I print fc8_forward. In contrast, the else block behaves as expected.

Any advice on this is greatly appreciated.
I would also like to know whether I should be copying my numpy arrays before passing them to the C++ bindings.

Thank you in advance for your help!


```
import caffe
import numpy as np

caffe_root = '/somewhere/caffe/'  # trailing slash needed for the path joins below
caffe.set_mode_gpu()
net = caffe.Net(caffe_root + 'models/bvlc_reference_caffenet/deploy_hallucination.prototxt',
                caffe_root + 'models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel',
                caffe.TEST)

conv5 = True
if conv5:  # doesn't work -- fc8_forward does not change on repeated execution of this block
    x = np.random.rand(1, 256, 13, 13) * 255
    net.blobs['conv5'].data[...] = x.copy()
    curr_forward = net.forward(start='pool5', end='fc8')
    # also tried start='conv5'; pool5 is right after conv5 -- same behavior
else:  # this works as expected
    y = np.random.rand(1, 3, 227, 227) * 255
    net.blobs['data'].data[...] = y.copy()
    curr_forward = net.forward(start='conv1', end='fc8')

fc8_forward = np.array(curr_forward['fc8'].flat)
print(fc8_forward)
```

The layers in VGG16 are:
['data', 'conv1', 'pool1', 'norm1', 'conv2', 'pool2', 'norm2', 'conv3', 'conv4', 'conv5', 'pool5', 'fc6', 'fc7', 'fc8', 'prob']
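
Note that this list looks like the blob names and omits in-place layers such as the ReLUs. A quick way to print the full layer execution order in pycaffe (a small sketch, assuming the same net object as above):

```
# net.blobs skips in-place layers; _layer_names lists every layer in
# execution order, including any in-place ReLUs between conv and pool.
print(list(net._layer_names))
```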

muhammad arif nasution

Nov 27, 2016, 9:58:25 AM
to Caffe Users
Any update on this thread?

Hunter Elliott

Dec 3, 2016, 3:38:38 PM
to Caffe Users
I was able to get this to work by setting the start layer in my call to net.forward to the layer immediately AFTER the one whose activations I altered (in your case, the layer after conv5, whatever that might be).

Also, if you are using in-place nonlinearity layers, the end= argument did not work for me. I had to either omit the end argument and let the forward pass run through the entire network, or change my deploy prototxt so that the last nonlinearity was not in-place (top different from bottom).
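
Here is a minimal sketch of that approach, using the layer names from the original post; relu5 is an assumption based on the standard CaffeNet deploy, where it runs in-place between conv5 and pool5:

```
import numpy as np

# Assumes `net` is already loaded as in the original post.
x = np.random.rand(1, 256, 13, 13) * 255
net.blobs['conv5'].data[...] = x

# Start from the layer immediately after the altered blob (relu5 here)
# and omit end= so the pass runs through the rest of the network.
net.forward(start='relu5')

print(net.blobs['fc8'].data.flatten())
```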

Harshana Habaragamuwa

Jul 13, 2017, 9:50:28 PM
to Caffe Users
Yes, this worked for me as well; in my case it was going from a ReLU to a fully connected layer.