You want to update only specific elements of the parameter blob, correct? For example, only certain slices of certain convolution kernels?
If so, I'm afraid you'd have to do this manually. The easiest hack would be to run the whole training loop yourself in Python (i.e. do the forward and backward passes, then apply the gradients using the newly added `apply_update` method), while storing the values you want to remain constant and restoring them manually after every parameter update. Does this have a specific use case? This general strategy seems pretty inefficient, but maybe we can work around it for your particular problem?
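To make the store-and-restore idea concrete, here is a minimal NumPy sketch of one training step with a frozen slice. All names here (`sgd_step_with_frozen`, the mask, the learning rate) are hypothetical placeholders for illustration, not part of any library's API; the `param -= lr * grad` line stands in for whatever `apply_update` does internally:

```python
import numpy as np

def sgd_step_with_frozen(param, grad, frozen_mask, lr=0.1):
    """Apply a plain SGD update in place, then restore the entries
    marked as frozen so they remain constant across the step."""
    saved = param[frozen_mask].copy()  # store values to keep constant
    param -= lr * grad                 # parameter update (stand-in for apply_update)
    param[frozen_mask] = saved         # manually restore the frozen entries
    return param

# Toy 3x3 "kernel": freeze its first row, update the rest.
kernel = np.ones((3, 3))
grad = np.full((3, 3), 0.5)
mask = np.zeros((3, 3), dtype=bool)
mask[0, :] = True

sgd_step_with_frozen(kernel, grad, mask, lr=0.2)
# first row stays 1.0; the other rows become 1.0 - 0.2 * 0.5 = 0.9
```

The same save/update/restore pattern would wrap each solver iteration in the manual Python loop described above.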