Have you read the AlexNet paper?
Figure 3 in that paper shows what the filter kernels actually learned. What you describe does not happen in practice: a diverse set of kernels is needed to extract different features.
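Also note that each kernel in a convolutional layer spans *all* input channels at once, so 32 kernels are not "one per channel" but 32 independent feature detectors. A minimal pure-NumPy sketch (hypothetical shapes, just to illustrate the idea) of a single conv layer:

```python
import numpy as np

# Each of the F kernels has shape (kh, kw, C), i.e. every kernel
# sees ALL C input channels at once and produces ONE feature map.
def conv2d(image, kernels):
    """image: (H, W, C); kernels: (F, kh, kw, C) -> (H-kh+1, W-kw+1, F)."""
    H, W, C = image.shape
    F, kh, kw, Ck = kernels.shape
    assert C == Ck, "every kernel must span all input channels"
    out = np.zeros((H - kh + 1, W - kw + 1, F))
    for f in range(F):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                # one output value = sum over the full (kh, kw, C) window
                out[i, j, f] = np.sum(image[i:i + kh, j:j + kw, :] * kernels[f])
    return out

rng = np.random.default_rng(0)
img = rng.random((8, 8, 3))          # a toy BGR image, 3 channels
kernels = rng.random((32, 3, 3, 3))  # 32 kernels, each spanning all 3 channels
out = conv2d(img, kernels)
print(out.shape)  # (6, 6, 32): one feature map per kernel, not per channel
```

So the input channel count only fixes the *depth* of each kernel; the number of kernels is a free design choice that determines how many different features the layer can detect.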
On Sunday, 31 July 2022 22:58:54 CEST Toposkovich wrote:
| Hello to everybody,
|
| I have one question that is maybe stupid, but I couldn't find
| any answer to it.
| My question is about the number of kernels in a convolution layer.
| After implementing LeNet, VGG16, VGG19, etc., I realized that the number
| of kernels in the convolutional layers is *always > 3*.
|
| Let's say I would like to create a completely new CNN network from scratch.
| Then I would think: *"OK... since I'm going to process BGR images (no gray
| pictures), I have for sure 3 different channels"*. Then I would think: *"Since
| I have 3 channels (one for Green, one for Blue and the last for Red) I need
| just 3 kernels. One kernel per channel."*
| Why would I think that? Because to me, using e.g. 32 kernels in the same
| convolutional layer would be redundant, since the *4th kernel* would have