Finetuning a deep fully convolutional neural network with skip connections

Jonathan Balloch

Sep 3, 2017, 9:24:44 PM
to Caffe Users
What is the proper procedure for finetuning a deep fully convolutional neural network with skip connections when transferring between different tasks? For a 'typical' network (no skip connections, not fully convolutional), at least the last layer is fully connected, meaning every node in that layer takes every node in the previous layer as input, and the number of outputs equals the number of class labels *M*. [Shelhamer's original FCN paper][1] and the follow-up PAMI paper give details on how to implement this kind of finetuning (also sometimes referred to as feature extraction); however, neither explains how it should be done for a network with skip connections like a [ResNet][2] or a [DenseNet][3]. Does anyone have advice on best practice here?
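For concreteness, here is the kind of net surgery I have in mind for the skip-connected case, sketched in pycaffe. All file names and the renamed layer names below are placeholders for my own setup, not anything from the FCN release:

```python
import caffe

# Placeholder file names -- substitute your own definitions/weights.
OLD_PROTO = 'fcn8s_pascal_deploy.prototxt'   # pretrained net definition
NEW_PROTO = 'fcn8s_newtask_train.prototxt'   # same net, but every per-skip score
                                             # layer renamed and num_output set to M
WEIGHTS   = 'fcn8s_pascal.caffemodel'

old_net = caffe.Net(OLD_PROTO, WEIGHTS, caffe.TEST)
new_net = caffe.Net(NEW_PROTO, caffe.TEST)

# Copy every parameter blob whose layer name matches. The renamed score
# layers at each skip ('score_fr_new', 'score_pool4_new', 'score_pool3_new'
# in my case) do not exist in the old net, so their weight fillers
# initialize them fresh for the new label set.
for name, params in old_net.params.items():
    if name in new_net.params:
        for i, blob in enumerate(params):
            new_net.params[name][i].data[...] = blob.data

new_net.save('fcn8s_newtask_init.caffemodel')
```

As I understand it, passing the pretrained weights via `caffe train -weights` does the same by-name copy automatically, so my real question is whether reinitializing just the per-skip score layers like this is the right recipe for ResNet/DenseNet-style skips, or whether more of the network needs to be reset.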

(x-posted to Reddit and StackOverflow)

[1]: https://arxiv.org/abs/1411.4038
[2]: https://arxiv.org/abs/1512.03385
[3]: https://arxiv.org/abs/1608.06993

