Why does CNN's convolution kernel number increase layer by layer?


caffer

Jul 5, 2016, 2:36:37 AM
to Caffe Users

I find that, whether in LeNet or VGG, the number of convolution kernels increases layer by layer, and the shallow layers have far fewer kernels than the deep layers. Could anyone give an explanation for this?
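For concreteness, the pattern being asked about can be seen in the conv layers of VGG-16 (configuration D from the VGG paper). A minimal sketch, just listing the output-channel counts rather than any actual prototxt:

```python
# Output-channel (kernel) counts of the 13 conv layers in VGG-16,
# in order of depth: the count never decreases as we go deeper.
vgg16_conv_channels = [64, 64, 128, 128, 256, 256, 256,
                       512, 512, 512, 512, 512, 512]

# Each max-pool halves the spatial resolution, so doubling the channel
# count keeps per-layer compute roughly comparable:
# cost ~ H * W * C_in * C_out * k * k.
assert all(a <= b for a, b in zip(vgg16_conv_channels,
                                  vgg16_conv_channels[1:]))
print(vgg16_conv_channels[0], vgg16_conv_channels[-1])  # 64 512
```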

Also, in my experiments I find that the kernel number affects the backward (gradient) values a lot. Could the smaller kernel number in the shallow layers be a way to mitigate the vanishing gradient?
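The observation that kernel number affects backward values can be reproduced in a toy setting. Below is a sketch (my own simplification, not anything from Caffe) using a fully-connected layer as a stand-in for a conv layer, with He-style initialization; the gradient passed back through a layer grows with its number of output kernels unless the weight scale compensates:

```python
import numpy as np

rng = np.random.default_rng(0)

def backward_grad_std(n_kernels, fan_in=64, n_samples=10000):
    """Std of the gradient flowing back through one layer with
    `n_kernels` output channels (toy fully-connected stand-in
    for a conv layer, He-style weight init)."""
    W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(n_kernels, fan_in))
    grad_out = rng.normal(size=(n_samples, n_kernels))  # unit-variance upstream grads
    grad_in = grad_out @ W                              # backprop: dL/dx = dL/dy @ W
    return grad_in.std()

# More kernels -> larger backward values (roughly 0.7 vs 2.0 here),
# consistent with the observation in the question.
print(backward_grad_std(16), backward_grad_std(128))
```

This suggests the gradient magnitude scales like sqrt(n_kernels / fan_in) under this initialization, which may be part of why the kernel count interacts with gradient behavior.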
