Hello all,
I am studying Caffe. I found that some layers, such as the convolution layer and the inner product layer, require us to set the param "num_output".
I only understand that the final "num_output" (in the last fully connected layer) should be set to the number of classes we want to predict. But for the other layers, I have no idea where those num_output values come from.
For example, in AlexNet, as written in
https://github.com/BVLC/caffe/blob/master/models/bvlc_alexnet/train_val.prototxt, the "conv1" layer has num_output: 96; "conv2" and "conv5" have num_output: 256; "conv3" and "conv4" have num_output: 384; and "fc6" and "fc7" have num_output: 4096.
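For reference, the "conv1" definition in that prototxt looks roughly like this (abridged; I have kept only the fields relevant to my question, so exact values of the other fields may differ from the linked file):

```protobuf
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96     # the value I am asking about
    kernel_size: 11
    stride: 4
  }
}
```

So num_output here seems to control how many filters (output channels) the layer produces, but I do not understand how the specific numbers like 96, 256, 384, and 4096 were chosen.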
Could anyone help explain why? Thank you!