Adding L2 normalization layers for normalized feature readout


Giedrius Tomas Burachas

Jan 8, 2016, 3:07:58 PM
to Caffe Users
Hi, 

I have added normalization layers after caffenet's fully connected layers (fc6, fc7, fc8), for the sole purpose of reading out the L2-normalized values of the fully connected features. The output (top) of the normalization layers is not used anywhere else in the network. I keep getting a "Duplicate blobs produced by multiple sources." error. I tried adding Silence layers on top of the normalization layers, but that did not help.

Any pointers would be much appreciated!
GTB

Evan Shelhamer

Jan 8, 2016, 3:21:07 PM
to Giedrius Tomas Burachas, Caffe Users
You likely need to give each of your new normalization layers a distinctly named top. Except for in-place layers, every top needs a unique name.
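
For instance, a minimal sketch of what that could look like; the layer names norm6/norm7 are made up here, since your prototxt isn't shown:

# Readout-only normalization layers with unique top names.
layer {
  name: "norm6"
  type: "LRN"
  bottom: "fc6"
  top: "norm6"   # unique top, not "fc6"
}
layer {
  name: "norm7"
  type: "LRN"
  bottom: "fc7"
  top: "norm7"   # unique top, not "fc7"
}
# Silence the readout tops, since nothing else consumes them.
layer {
  name: "silence_norms"
  type: "Silence"
  bottom: "norm6"
  bottom: "norm7"
}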

Evan Shelhamer






Giedrius Tomas Burachas

Jan 8, 2016, 3:43:13 PM
to Evan Shelhamer, Caffe Users
Thanks! The tops actually are unique...
GTB

Giedrius Tomas Burachas

Jan 8, 2016, 3:53:44 PM
to Caffe Users, evan.sh...@gmail.com
Solved! In case someone wonders: I had placed the LRN layers *before* the ReLU layers. Moving them to after ReLU fixes the problem.

GTB

Evan Shelhamer

Jan 8, 2016, 5:25:17 PM
to Giedrius Tomas Burachas, Caffe Users
Right: if you double-check, you'll see the ReLU layers have the same top name as fc{6,7} (and each bottom name equals the top name), which designates them as in-place. Inserting the LRN layers between the FC and ReLU layers makes the ReLU no longer in-place, and so the tops are duplicated. I hope that's clearer.
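
To make that concrete, here is a sketch using the stock caffenet names; since your exact wiring isn't shown in the thread, treat the "broken" variant as a guess at what happened:

# Stock caffenet: relu6 is in-place, because its bottom and top are both "fc6".
layer { name: "relu6" type: "ReLU" bottom: "fc6" top: "fc6" }

# Likely broken ordering: splicing the LRN in between fc6 and relu6, e.g.
#   layer { name: "norm6" type: "LRN"  bottom: "fc6"   top: "norm6" }
#   layer { name: "relu6" type: "ReLU" bottom: "norm6" top: "fc6" }
# means relu6 is no longer in-place (its bottom and top differ), so its "fc6"
# top collides with the "fc6" blob already produced by the fc6 InnerProduct
# layer, giving the duplicate-blob error.

# Working ordering: leave the in-place ReLU alone and hang the readout LRN
# off "fc6" afterwards, with its own top name.
layer { name: "norm6" type: "LRN" bottom: "fc6" top: "norm6" }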

Evan Shelhamer



