How can I tell how much a given feature contributes to the final classification accuracy?


Tiferet Gazit

Jan 13, 2016, 8:50:35 AM
to Caffe Users
I have a few features that are more expensive to compute than most, and I would like to see whether the network actually ends up relying heavily on them or whether I can leave them out, so as to reduce test time for each new example. I can try training and testing with and without each of these features in turn to see how heavily each affects the prediction accuracy, but I was wondering whether there is an easier way. Is there some way to visualize the "weight" a certain input has in the trained network, or its contribution to the final classification?
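One common, cheaper alternative to full retraining (an approach of my own suggestion, not something endorsed in this thread) is permutation importance: keep the trained model fixed and shuffle a single input feature across the test set, then measure the drop in accuracy. A minimal sketch, where a toy linear scorer stands in for the trained network (in practice you would call your trained Caffe net's forward pass instead):

```python
import random

def predict(x):
    # Toy stand-in for a trained classifier: leans heavily on feature 0,
    # lightly on feature 1, and ignores feature 2 entirely.
    return 1 if 2.0 * x[0] + 0.5 * x[1] + 0.0 * x[2] > 1.0 else 0

def accuracy(X, y):
    return sum(predict(x) == t for x, t in zip(X, y)) / len(y)

def permutation_importance(X, y, feature, seed=0):
    """Accuracy drop when one feature column is shuffled across examples."""
    col = [x[feature] for x in X]
    random.Random(seed).shuffle(col)
    X_perm = [list(x) for x in X]
    for row, v in zip(X_perm, col):
        row[feature] = v
    return accuracy(X, y) - accuracy(X_perm, y)

# Usage on synthetic data labeled by the toy model itself.
rng = random.Random(42)
X = [[rng.random() for _ in range(3)] for _ in range(200)]
y = [predict(x) for x in X]
for f in range(3):
    print("feature", f, "importance:", permutation_importance(X, y, f))
```

Note this measures how much the *already trained* network relies on a feature; it does not tell you how well a network retrained from scratch without that feature would do, so a final train/test ablation on the expensive features is still the definitive check.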

xwzh...@gmail.com

Oct 13, 2016, 8:26:22 PM
to Caffe Users
How did your work go? I am currently working on a program similar to yours. Would you mind sharing your ideas?