Shared weights


Tim Meywerk

Nov 2, 2017, 9:51:44 AM
to Caffe Users
Hi,
how would I go about sharing weights inside a layer? I know there are special layers that share weights, like the convolutional layers. My envisioned net, however, does not use any of these predefined layers as far as I know.
Consider this example: I have one 1-dimensional bottom blob of size n and a top blob of the same shape and size. I'd like to share weights in such a way that all edges between bottom[i] and top[i] use the same weight for all i, and likewise for the edges between bottom[i] and top[(i + 1) % n], and so on. Is this possible with the predefined layers, or would I need to implement my own layer? In the latter case, how would I ideally go about this?
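
For concreteness, here is a small NumPy sketch of the tying I have in mind (this is not Caffe code; n and the weight vector w are just placeholder names):

import numpy as np

n = 8
w = np.random.randn(n)             # one shared weight per offset k

# W[i, j] is the weight on the edge bottom[i] -> top[j];
# every edge with the same offset (j - i) % n gets the same weight
W = np.empty((n, n))
for i in range(n):
    for j in range(n):
        W[i, j] = w[(j - i) % n]

x = np.random.randn(n)
y = x.dot(W)                       # top[j] = sum_i bottom[i] * w[(j - i) % n]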
Best regards,
Tim

Tim Meywerk

Nov 2, 2017, 1:12:03 PM
to Caffe Users
I just realized my example above is actually a convolutional net. So let's instead consider a more arbitrary example: we still have 1-dimensional top and bottom blobs of size n. I now want to have only 4 different weights: for the edge between bottom[i] and top[j] we use w1 iff both i and j are prime, w2 iff i is prime and j is not, w3 iff j is prime and i is not, and w4 iff neither is prime.
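
To make that concrete, here is a rough sketch of how I imagine this as a custom Python layer (declared with type: "Python" in the prototxt). It keeps only the 4 tied weights as learnable parameters and expands them into a full n x n matrix through a fixed index map on every pass. The class name TiedWeightLayer and all the details are just an assumption on my part, not tested code:

import numpy as np
import caffe

def is_prime(k):
    if k < 2:
        return False
    return all(k % d for d in range(2, int(k ** 0.5) + 1))

class TiedWeightLayer(caffe.Layer):
    def setup(self, bottom, top):
        self.n = bottom[0].data.shape[-1]
        # tie_index[i, j] picks which shared weight sits on the edge
        # bottom[i] -> top[j]: 0 = both prime (w1), 1 = only i prime (w2),
        # 2 = only j prime (w3), 3 = neither prime (w4)
        self.tie_index = np.empty((self.n, self.n), dtype=int)
        for i in range(self.n):
            for j in range(self.n):
                self.tie_index[i, j] = 2 * (not is_prime(i)) + (not is_prime(j))
        # a single learnable blob holding w1..w4
        self.blobs.add_blob(4)
        self.blobs[0].data[...] = 0.01 * np.random.randn(4)

    def reshape(self, bottom, top):
        top[0].reshape(*bottom[0].data.shape)

    def forward(self, bottom, top):
        W = self.blobs[0].data[self.tie_index]    # expand the 4 weights to n x n
        x = bottom[0].data.reshape(-1, self.n)
        top[0].data[...] = x.dot(W).reshape(top[0].data.shape)

    def backward(self, top, propagate_down, bottom):
        W = self.blobs[0].data[self.tie_index]
        x = bottom[0].data.reshape(-1, self.n)
        dy = top[0].diff.reshape(-1, self.n)
        full_grad = x.T.dot(dy)                   # gradient for every individual edge
        for k in range(4):                        # sum onto the 4 shared weights
            self.blobs[0].diff[k] = full_grad[self.tie_index == k].sum()
        if propagate_down[0]:
            bottom[0].diff[...] = dy.dot(W.T).reshape(bottom[0].diff.shape)

Would something along these lines be the intended way to do it, or is there a way to express this kind of tying with the predefined layers?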