1x1 convolution - why use it?


Ran Manor

Feb 28, 2015, 8:41:01 AM2/28/15
to caffe...@googlegroups.com
Hi,

I'm looking at the GoogLeNet train proto, and I don't understand the reason for using 1x1 convolution.
Initially I thought that 1x1 convolutions allow you to specify a lower number of num_outputs to reduce dimensionality, but I see instances where the num_output is equal.
So why use it?
Doesn't it mean the net just passes a single weight across the input?

Thanks,
Ran

Anatoly Baksheev

Feb 28, 2015, 8:52:22 AM2/28/15
to Ran Manor, caffe...@googlegroups.com
Besides dimensionality reduction, it's an additional nonlinearity layer. It allows the net to combine channels from the previous layer in a non-linear fashion, which leads to more advanced features (in contrast to simple linear filters).



-- Anatoly
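Anatoly's point can be sketched in plain numpy (an illustrative sketch, not Caffe code; `conv1x1` and `relu` are made-up helper names): a 1x1 convolution touches no spatial neighborhood at all, it just takes, at every pixel, a learned linear combination of that pixel's input channels, and the ReLU after it supplies the nonlinearity.

```python
import numpy as np

def conv1x1(x, w):
    """1x1 convolution. x: (C_in, H, W) feature map, w: (C_out, C_in) weights.
    Each output pixel is a linear mix of that same pixel's input channels."""
    c_in, h, width = x.shape
    # flatten space to (C_in, H*W), mix channels with one matmul, reshape back
    y = w @ x.reshape(c_in, h * width)
    return y.reshape(w.shape[0], h, width)

def relu(x):
    return np.maximum(x, 0.0)

# example with #in = #out: 3 channels in, 3 channels out, spatial size unchanged
x = np.random.randn(3, 4, 4)
w = np.random.randn(3, 3)
y = relu(conv1x1(x, w))
print(y.shape)  # (3, 4, 4)
```

Note that `conv1x1` alone is linear; stacking it would collapse into one linear map. It is the ReLU between such layers that makes the channel combinations non-linear.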

--
You received this message because you are subscribed to the Google Groups "Caffe Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to caffe-users...@googlegroups.com.
To post to this group, send email to caffe...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/caffe-users/e0c59b09-9404-4798-ac90-37835627657c%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Ran Manor

Feb 28, 2015, 8:54:21 AM2/28/15
to Anatoly Baksheev, caffe...@googlegroups.com
I understand, thanks!

- Ran

npit

Apr 27, 2015, 10:51:35 AM4/27/15
to caffe...@googlegroups.com, anatoly....@itseez.com
By "1x1 convolution" do we mean that the filter size is a single pixel? (Because then the output feature map would not be spatially smaller than the input.)

Gdrs

Jun 23, 2015, 9:35:05 AM6/23/15
to caffe...@googlegroups.com
npit, 

1x1 refers to the "receptive field". With 1x1 you don't get dimensionality reduction in space, but you can get dimensionality reduction in the channel count if #inputs is larger than #outputs. If #in = #out, all you get is the ReLU nonlinearity.
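That channel-count reduction can be seen directly from the shapes (a numpy sketch with illustrative sizes, not actual GoogLeNet dimensions):

```python
import numpy as np

# input feature map: 64 channels over an 8x8 spatial grid
x = np.random.randn(64, 8, 8)
# 1x1 conv weights: 16 output channels, each mixing all 64 input channels
w = np.random.randn(16, 64)

# a 1x1 convolution is a matrix multiply over the channel axis at every pixel
y = (w @ x.reshape(64, -1)).reshape(16, 8, 8)

print(x.shape)  # (64, 8, 8) - spatial size untouched
print(y.shape)  # (16, 8, 8) - channels reduced 64 -> 16
```

With #in = #out the same operation would map (64, 8, 8) to (64, 8, 8), and the only thing gained is the nonlinearity applied afterwards.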

npit

Jun 24, 2015, 4:23:22 AM6/24/15
to caffe...@googlegroups.com
Got it, thanks.