
How to share the parameters of dropout layer in Siamese?


Viveka Kulharia

Mar 6, 2016, 4:55:17 PM
to Caffe Users
Hi, I am using the following to share the dropout layer's parameters between the two branches of a Siamese network, but it gives the error: "Too many params specified for layer drop3_p ...". Is it the case that the dropout layer has no parameters that need to be shared? Thanks.

layer {
  name: "drop3_p"
  type: "Dropout"
  bottom: "fc3-conv_p"
  top: "fc3-conv_p"
  param { 
    name: "share_drop3"
  }
  dropout_param {
    dropout_ratio: 0.5
  }
}

Jan

Mar 7, 2016, 2:57:43 AM
to Caffe Users
To the best of my knowledge, a dropout layer does not have any trainable parameters, so there is nothing to share, and specifying a param block for it causes the "Too many params" error. The only setting a dropout layer has is the dropout_ratio, but that is a fixed hyperparameter and is not altered by training.
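
A minimal sketch of the layer with the param block removed (assuming the rest of your Siamese definition stays as posted) — since there is nothing to share, simply give each branch its own dropout layer:

```
layer {
  name: "drop3_p"
  type: "Dropout"
  bottom: "fc3-conv_p"
  top: "fc3-conv_p"
  # no param block: dropout has no learnable blobs to share
  dropout_param {
    dropout_ratio: 0.5  # fixed hyperparameter, not learned
  }
}
```

Note that dropout behaves identically in both branches anyway as long as both layers use the same dropout_ratio.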

Jan