Pruning Network


林子達

Sep 3, 2015, 10:26:05 PM
to Caffe Users
Hi all,
   According to Learning both Weights and Connections for Efficient Neural Networks, the authors prune parameters to speed up inference. Has anyone implemented this in Caffe?

Regards



Axel Angel

Sep 6, 2015, 12:45:30 PM
to Caffe Users
Sorry for not answering your question directly, but I read some posts here a few months ago on a similar topic that may be of interest. I cannot find that thread anymore, but here are some related papers that were cited:
 - Data-free parameter pruning for Deep Neural Networks http://arxiv.org/abs/1507.06149
 - Pruning algorithms-a survey
 - Reshaping deep neural network for fast decoding by node-pruning http://ieeexplore.ieee.org/xpl/login.jsp?arnumber=6853595

And many many more.

林子達

Sep 6, 2015, 9:32:05 PM
to Caffe Users
Hi Axel,
  Thanks for the information. I will check these papers. I am wondering why parameter pruning is not implemented in Caffe.

Marcella Astrid

Dec 6, 2015, 1:14:59 PM
to Caffe Users
I am still a newbie in Caffe and Python, but I am also interested in this after reading the Deep Compression paper. I want to try the pruning first. According to the paper,
"Pruning is implemented by adding a mask to the blobs to mask out the update of the pruned connections."

But I am not sure how to do this. Have you managed to do it in the end?
Is it done with an Eltwise layer that multiplies the pruned weights by 0? I am still not sure how to do that either. Any clue or reference would help, e.g. how to apply a binary array such as {0,1,1,0,0} as a mask to the weights.
Or do I have to modify the C++ inside Caffe itself?
I also found the backward_gpu function in the Eltwise layer, which takes a const vector<bool>& propagate_down parameter, but I still don't know how to use it from Python with the net surgery approach.
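
What I have in mind from the Python side is roughly the sketch below. It is only my guess at how to approximate the paper's idea (it re-applies the mask after every solver step instead of masking the gradient inside Caffe), and the solver/weights file names, layer names, and threshold are placeholders I made up:

    import numpy as np
    import caffe

    # Placeholder solver config and pretrained weights (not real files).
    solver = caffe.SGDSolver('solver.prototxt')
    solver.net.copy_from('pretrained.caffemodel')

    # 1) Build a binary mask per layer: 1 keeps a weight, 0 prunes it.
    threshold = 0.01                               # magnitude threshold, chosen arbitrarily
    masks = {}
    for name in ('conv1', 'fc6'):                  # example layer names
        w = solver.net.params[name][0].data
        masks[name] = (np.abs(w) > threshold).astype(w.dtype)
        w *= masks[name]                           # zero the pruned weights in place

    # 2) Fine-tune, re-applying the mask after every update so the
    #    pruned connections stay at zero instead of being re-learned.
    for it in range(1000):
        solver.step(1)
        for name, mask in masks.items():
            solver.net.params[name][0].data[...] *= mask

If I understand correctly, this only sets entries of dense blobs to zero, so by itself it would not make inference in stock Caffe any faster; the sparsity would still have to be exploited somewhere.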

Ada

May 23, 2016, 10:57:48 AM
to Caffe Users
Hi,

I have recently been working on the same topic. Have you made any progress? If so, could you share some tips or suggestions?

Thanks,
Ada