add a binary mask to do element-wise multiplication on weights in the fully-connected layer


Shawn Fang

Jan 30, 2017, 8:59:38 PM1/30/17
to Caffe Users
Hi, everyone!

I'm trying to apply a binary mask to the weights in the fully-connected layer. The values of the mask are fixed and have nothing to do with training. Basically, the weight matrix is multiplied element-wise by the mask before it multiplies the input neurons. Can anyone tell me how to write this in the Caffe prototxt? Or is there another way to solve this?
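What the question describes, sketched with NumPy (the shapes and values here are hypothetical, just to show the operation):

```python
import numpy as np

# Hypothetical sizes: 4 input neurons, 3 output neurons.
W = np.arange(12, dtype=np.float64).reshape(3, 4)   # FC weight matrix
mask = np.array([[1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [1, 1, 0, 0]], dtype=np.float64)   # fixed binary mask, same shape as W
x = np.ones(4)                                      # input activations

# Mask the weights element-wise, then apply the FC layer as usual.
y = (W * mask) @ x

# Entries where mask == 0 contribute nothing to the output.
print(y)
```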

Thanks,
Shawn

Przemek D

Jan 31, 2017, 2:30:26 AM1/31/17
to Caffe Users
There is an Eltwise layer that performs an element-wise operation of your choice (PROD, SUM, or MAX) between its bottoms. You still need to load your mask somehow, perhaps as a data layer? Or a Python layer?
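A sketch of how the Eltwise suggestion could look in prototxt. The blob names are assumptions, and the "mask" bottom must have the same shape as the blob it multiplies. Note that Eltwise operates on blobs (here, the FC layer's output activations), not on the weight matrix itself; masking the weights directly would require a custom layer.

```protobuf
# Hypothetical names: "fc6" is the FC output, "mask" is produced elsewhere
# (e.g. by a data layer or a Python layer).
layer {
  name: "masked_fc6"
  type: "Eltwise"
  bottom: "fc6"
  bottom: "mask"
  top: "masked_fc6"
  eltwise_param {
    operation: PROD
  }
}
```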

Shawn Fang

Feb 3, 2017, 8:29:01 PM2/3/17
to Caffe Users
Hello, thanks for answering. Is there any tutorial on how to build a data/Python layer?
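For reference, a Python layer in Caffe is a class implementing setup/reshape/forward/backward, declared in the prototxt with type: "Python"; the examples/pycaffe directory in the Caffe repository has working samples. A minimal sketch of a layer that emits a fixed mask (the module, file, and layer names are assumptions; the import guard just lets the sketch be read outside a Caffe install):

```python
import numpy as np

try:
    import caffe                 # only available inside a Caffe install
    Base = caffe.Layer
except ImportError:
    Base = object                # fallback so the sketch can be inspected without Caffe

class MaskLayer(Base):
    """Hypothetical Python layer that outputs a fixed binary mask.

    In the prototxt it would be declared roughly as:
        layer {
          name: "mask"
          type: "Python"
          top: "mask"
          python_param { module: "mask_layer" layer: "MaskLayer" }
        }
    """
    def setup(self, bottom, top):
        # Load the fixed mask once; the file name is an assumption.
        self.mask = np.load("mask.npy").astype(np.float32)

    def reshape(self, bottom, top):
        top[0].reshape(*self.mask.shape)

    def forward(self, bottom, top):
        top[0].data[...] = self.mask

    def backward(self, top, propagate_down, bottom):
        pass                     # the mask is constant: nothing to learn
```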