How to interpret fc7 activations in a Fully Convolutional Network (AlexNet)?


Kaj-Robin Weslien

unread,
Feb 5, 2016, 4:09:34 PM2/5/16
to Caffe Users
Hi,

I have a trained AlexNet where I have done the "net surgery" to replace the fully connected layers with convolutional ones.
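(For reference, the net-surgery step amounts to copying each fully connected weight matrix into a convolution kernel of matching size. A minimal NumPy sketch, assuming the standard AlexNet layer sizes; random data stands in for real weights:)

```python
import numpy as np

# Assumed standard AlexNet shapes.
# fc6 weights: (4096, 9216), where 9216 = 256 channels * 6 * 6 from pool5.
fc6_w = np.random.randn(4096, 9216).astype(np.float32)

# Net surgery: reinterpret each 9216-dim row as one 256x6x6 convolution filter.
fc6_conv_w = fc6_w.reshape(4096, 256, 6, 6)

# fc7 weights: (4096, 4096) become 4096 1x1 convolutions over 4096 channels.
fc7_w = np.random.randn(4096, 4096).astype(np.float32)
fc7_conv_w = fc7_w.reshape(4096, 4096, 1, 1)

print(fc6_conv_w.shape)  # (4096, 256, 6, 6)
print(fc7_conv_w.shape)  # (4096, 4096, 1, 1)
```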

My input is a 2048x2048 image, and the fc8-conv layer is my output, which gives me roughly what I expect: dimensions [1, 5, 61, 61] when using 5 classes, i.e. a 61x61 "heat map" for each class. Why I don't get 64x64 is a mystery to me, but fair enough.
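(The heat map being smaller than 2048/32 = 64 follows from the usual convolution arithmetic: each layer shrinks its input by roughly kernel - 1 before striding, and the large fc6-conv kernel trims the borders further. A rough sketch of the per-layer formula; the exact total depends on your net's padding and on Caffe's ceil-mode pooling, so this does not reproduce 61 exactly:)

```python
def conv_out(size, kernel, stride=1, pad=0):
    """Output spatial size of a convolution layer (floor rounding,
    as Caffe uses for convolution; Caffe pooling rounds up instead)."""
    return (size + 2 * pad - kernel) // stride + 1

# Example: AlexNet conv1 (11x11 kernel, stride 4, no pad) on a 2048-px input.
print(conv_out(2048, 11, 4, 0))  # 510

# A 1x1 kernel with stride 1 leaves the spatial size unchanged,
# which is why fc7-conv and fc8-conv should keep fc6-conv's grid.
print(conv_out(64, 1))  # 64
```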

When looking into layer fc7-conv, which has 4096 outputs, I would have expected dimensions [1, 4096, 64, 64] or similar, but instead Python shows me [4096, 4096, 1, 1]. I have tried reshaping this blob into arrays containing a 4096-dimensional "feature vector" for each point in a 64x64 image, but I can't see how the data is organized.

Any clues would be helpful.

When I run images through the standard AlexNet, I can see that the 4096-dimensional "feature vector" in fc7 makes sense: similar objects get similar values there. But no matter how I permute the data in fc7-conv, the numbers just look random to me.

Evan Shelhamer

unread,
Apr 14, 2016, 4:51:45 AM4/14/16
to Kaj-Robin Weslien, Caffe Users
Hi,

The `net.blobs['fc7-conv'].data` array should likewise have spatial dimensions like the `fc8-conv` layer does. Can you post code that reproduces the issue?

In normal usage every layer of an FCN will have batch, channel, and spatial dimensions with the usual interpretations.
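(One hedged guess, not confirmed in the thread: [4096, 4096, 1, 1] is exactly the shape of fc7-conv's *weight* blob, i.e. `net.params['fc7-conv'][0].data`, which holds 4096 1x1 filters over 4096 input channels, so the array being inspected may be the parameters rather than the activations in `net.blobs`. A NumPy sketch of a 1x1 convolution makes the two shapes explicit; toy sizes stand in for the real 4096 channels:)

```python
import numpy as np

# Toy sizes for speed; in the real net C = 4096 and H = W ~ 64.
N, C, H, W = 1, 8, 4, 4

acts_in = np.random.randn(N, C, H, W).astype(np.float32)   # like net.blobs[...].data
weights = np.random.randn(C, C, 1, 1).astype(np.float32)   # like net.params[...][0].data

# A 1x1 convolution is a matrix multiply applied at every spatial
# position, so the activations keep their spatial dimensions.
out = np.einsum('oc,nchw->nohw', weights[:, :, 0, 0], acts_in)

print(out.shape)  # (1, 8, 4, 4): same spatial layout as the input
```

So the activation shape stays [N, C, H, W] while the parameter shape is [C_out, C_in, 1, 1]; if the reported array really came from `net.blobs`, posting the reproducing code (as Evan asks) would pin down the discrepancy.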

Evan Shelhamer

To view this discussion on the web visit https://groups.google.com/d/msgid/caffe-users/a6abf76d-09f2-461e-8165-89c43695eeb9%40googlegroups.com.
