Effect of a higher num_output in fc layer


spaceman_spiff

Jan 6, 2016, 2:05:12 PM
to Caffe Users
Hi, 

Say I am fine-tuning the available AlexNet/GoogleNet model on a fresh set of images. The num_output for the final fully-connected layer in the provided train_val.prototxt is 1000.
But my new classification set has only 10 classes (for now; it may grow to ~25 in the future). Will it hurt the net's ability to learn if I leave the value at 1000?
(Since the other 990 classes never appear in the training data and are never learned, does this affect accuracy, the ability to learn, or anything else?) If so, what other issues could come up because of this?

Alex Orloff

Jan 6, 2016, 9:08:09 PM
to Caffe Users
Hi,
In your fine-tuning prototxt you can always replace the last layer with a new one containing only 10 or 25 outputs.
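
A minimal sketch of what that replacement layer could look like, assuming the standard AlexNet layer names (`fc7` as the bottom blob); the name `fc8_new` is illustrative. Renaming the layer matters: Caffe copies weights from the `.caffemodel` by layer name, so a new name makes the layer initialize from scratch instead of inheriting the old 1000-way weights.

```protobuf
# Replacement classifier layer for fine-tuning (sketch).
layer {
  name: "fc8_new"          # new name => weights NOT copied from the snapshot
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8_new"
  param { lr_mult: 10  decay_mult: 1 }   # train the fresh layer faster
  param { lr_mult: 20  decay_mult: 0 }   # than the pre-trained layers
  inner_product_param {
    num_output: 10         # your current number of classes
    weight_filler { type: "gaussian" std: 0.01 }
    bias_filler { type: "constant" value: 0 }
  }
}
```

Remember to point the loss and accuracy layers' `bottom` at `fc8_new` as well.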

On Wednesday, January 6, 2016 at 10:05:12 PM UTC+3, spaceman_spiff wrote:

spaceman_spiff

Jan 6, 2016, 10:04:33 PM
to Caffe Users
Hi Alex,

Thanks for the reply. But my question was whether the net's ability to learn/classify would be affected if I keep num_output at a higher value. Since I don't have examples of those extra classes, the net will simply not learn them, right? I'm asking because I'd prefer not to change the layer/protobuf every time I add a new class to the existing 10. This is not a show-stopper for my experiments; just asking out of curiosity :)
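
The unused outputs are not quite ignored, though. With a softmax cross-entropy loss, the gradient with respect to each logit is `p - y`, so every output unit, including the 990 classes that never appear as labels, receives a nonzero gradient equal to its softmax probability and keeps getting its weights updated (pushed toward low scores). A small numpy sketch of this (illustrative only, not Caffe code):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - z.max())
    return e / e.sum()

logits = np.zeros(1000)   # fc layer with num_output: 1000
label = 3                 # ground truth is always one of classes 0..9

p = softmax(logits)

# Gradient of cross-entropy w.r.t. the logits: dL/dz = p - y
grad = p.copy()
grad[label] -= 1.0

# An "unused" output (e.g. class 500) still gets a nonzero gradient
# equal to its probability, so its weights are still trained.
print(grad[500])          # 0.001 (= 1/1000 for uniform logits)
print(grad[label])        # -0.999
```

So leaving num_output at 1000 mostly costs extra parameters and compute, and spreads a little probability mass over classes that can never be correct; with enough data the net learns to suppress them, but a 10-way layer is cleaner.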