How should I directly output the weight parameters of the fully connected layer?


wal...@gmail.com

Jul 25, 2018, 10:59:25 PM
to Caffe Users
My network has two fully connected layers. I want to compare the difference between their weights as part of the loss. How can I access the weights directly during the forward pass, rather than the layers' output data?

I have seen tutorials that use net.params to read a layer's parameters, but I don't know how to get these values during the forward pass and pass them into the Python layer that computes the loss.
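The loss I have in mind is something like the squared Frobenius distance between the two weight matrices. As a standalone sketch (plain NumPy, independent of Caffe; the function name and shapes are made up):

```python
import numpy as np

def weight_diff_loss(w1, w2):
    """Squared L2 (Frobenius) distance between two weight matrices."""
    d = w1 - w2
    return float(np.sum(d * d))

# Toy check with two 2x2 matrices differing in one entry by 1.
w1 = np.array([[1.0, 2.0], [3.0, 4.0]])
w2 = np.array([[1.0, 2.0], [3.0, 5.0]])
print(weight_diff_loss(w1, w2))  # 1.0
```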

Przemek D

Aug 22, 2018, 11:12:57 AM
to Caffe Users
Normally, Caffe layers hold parameters in parameter blobs and emit data through output blobs - never the other way around. But I think this might be possible using the Parameter layer. For each of your InnerProduct layers, define a corresponding Parameter layer whose internal blob has exactly the same dimensions as the InnerProduct's weight blob. Then make each pair share parameters (that is, the InnerProduct shares its weights with its Parameter layer). In theory, this would let you access the InnerProduct's weight blob as the output of the corresponding Parameter layer. Then you could Eltwise the two outputs, or do whatever else you need to compute your loss. I haven't tested it though, it just came to my mind. If you decide to try it out, be sure to let me know if it worked ;)
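An untested prototxt sketch of one such pair - all layer names, blob names, and dimensions here are made up, and the shape in parameter_param must match the InnerProduct's actual weight shape (num_output x input_dim):

```
# InnerProduct layer that names its weight blob so it can be shared.
layer {
  name: "fc1"
  type: "InnerProduct"
  bottom: "data"
  top: "fc1"
  param { name: "fc1_w" }   # weight blob, shared by name
  param { name: "fc1_b" }   # bias blob
  inner_product_param { num_output: 100 }
}
# Parameter layer exposing the same weight blob as a top.
layer {
  name: "fc1_w_out"
  type: "Parameter"
  top: "fc1_w_data"
  param { name: "fc1_w" }   # same name -> shares fc1's weight blob
  parameter_param { shape { dim: 100 dim: 64 } }  # must equal fc1's weight shape
}
```

Repeat the same for fc2, then feed both Parameter tops as bottoms into your loss layer (Eltwise or a Python layer).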