to Caffe Users
Hi, I am using the following to share dropout parameters between the two branches of a Siamese network, but it gives the error: "Too many params specified for layer drop3_p ...". Is it the case that dropout layer parameters need not be shared? Thanks.
layer {
  name: "drop3_p"
  type: "Dropout"
  bottom: "fc3-conv_p"
  top: "fc3-conv_p"
  param {
    name: "share_drop3"
  }
  dropout_param {
    dropout_ratio: 0.5
  }
}
Jan
Mar 7, 2016, 2:57:43 AM3/7/16
To the best of my knowledge, a dropout layer does not have any trainable parameters, so there is nothing to be shared. The only parameter of a dropout layer is the dropout_ratio, but that is fixed and not altered by training.
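In other words, the param { ... } block declares a shared learnable blob, and since Dropout has no learnable blobs, Caffe rejects it with the "Too many params specified" error. A minimal sketch of the layer with the param block removed (keeping the blob names from the question; the fc3-conv_p names are the questioner's own):

layer {
  name: "drop3_p"
  type: "Dropout"
  bottom: "fc3-conv_p"
  top: "fc3-conv_p"
  dropout_param {
    dropout_ratio: 0.5
  }
}

If both branches should behave identically, it should be enough to set the same dropout_ratio in each branch's dropout layer; no explicit weight sharing is involved.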