My question is: is there any way to build only some of the layers in Caffe? For example, only pooling, convolution, and inner product, but not the AbsVal layer?
I am trying to compile a variant of Caffe (the Excitation Backprop model,
https://github.com/jimmie33/Caffe-ExcitationBP). The authors have implemented new backpropagation functions for some of the layers, such as pooling, convolution, and inner product. They edited include/caffe/util/device_alternate.hpp to declare four new Backward_<cpu/gpu> functions, and this header appears to be included by all layers. As a result, when `make all` is run, the layers for which these functions are not implemented produce compilation errors. I tried adding the function declarations to the header files, but the implementation logic is different for each layer. Although I could add dummy functions to each layer, the easiest option would be to exclude all the other layers during the build.