Modify weights while manually training in Python during every single backward iteration


Belhassen Bayar

Feb 3, 2016, 12:15:05 PM
to Caffe Users
I would like to train my network manually in Python because I want to impose a very specific template on the kernel weights of the very first convolutional layer.
More specifically, I want the (i, j)-th value of the convolutional kernel in layer 1 to always equal a scalar called beta.
So I am wondering whether I can do that manually: call net.forward(), then net.backward() from the loss layer down to layer 1, then set the (i, j)-th entry back to beta, then run net.forward() again, and so on until the network converges.
If so, how can I do that, and will the learning rate be updated as described in the solver.prototxt file?

I am also wondering whether I can call net.backward(layer_n, layer_m) so that it backpropagates from layer_n to layer_m.

Many thanks in advance!


Jan C Peters

Feb 4, 2016, 3:08:15 AM
to Caffe Users
From a quick look at the code, it seems to me that the caffe.Net class does not expose the update() method to Python (the method that actually subtracts the scaled gradients from the weights). forward() and backward() do not change the weights of the layers, only the data and diff parts of the blobs. So to do training from pycaffe you need to use the exposed solver classes. You can call the step(1) method to perform a single iteration, adjust the weights, call step(1) again, and so on.
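
A minimal sketch of that loop, assuming the first convolutional layer is named 'conv1'; 'solver.prototxt', niter, i, j and beta are placeholders you would adapt to your own setup:

import caffe

caffe.set_mode_cpu()  # or caffe.set_mode_gpu()

# Placeholder names/values for illustration only.
solver = caffe.SGDSolver('solver.prototxt')
i, j, beta = 0, 0, 0.5
niter = 10000  # total number of training iterations

for _ in range(niter):
    # One forward + backward pass plus a weight update; step() follows
    # the learning-rate policy configured in solver.prototxt, so the
    # learning rate evolves exactly as described there.
    solver.step(1)
    # Re-impose the constraint after each update. params['conv1'][0]
    # holds the kernel weights, shaped (num_output, channels, height,
    # width); [1] would be the biases.
    solver.net.params['conv1'][0].data[:, :, i, j] = beta

As for backpropagating only part of the net: pycaffe's Net.backward() does accept start and end layer names, e.g. net.backward(start='loss', end='conv1'), but note that on its own it only fills in the diffs; it never updates any weights.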

Jan