For the layers you want to fix (freeze), you can override the module's accGradParameters function with a no-op, so those parameters never accumulate gradients.
For example, say you have a model like

    model = nn.Sequential():add(nn.Linear(2,2)):add(nn.Linear(2,2))

You can fix the first layer by doing

    model:get(1).accGradParameters = function() end

Now, if you want different learning rates per layer, look at the second post in
https://gist.github.com/szagoruyko/1e994e713fce4a41773e
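Putting the freezing trick together, here is a minimal sketch (assuming the standard torch and nn packages; the toy input/target tensors are just for illustration). After overriding accGradParameters, a backward pass leaves the frozen layer's gradients untouched, while updateGradInput still runs, so gradients keep flowing to any layers below it.

```lua
require 'nn'

-- Two-layer model; we want the first Linear frozen.
local model = nn.Sequential()
  :add(nn.Linear(2, 2))
  :add(nn.Linear(2, 2))

-- No-op override: layer 1 never accumulates gradWeight/gradBias.
model:get(1).accGradParameters = function() end

local criterion = nn.MSECriterion()
local input, target = torch.randn(2), torch.randn(2)  -- toy data

model:zeroGradParameters()
local output = model:forward(input)
local loss = criterion:forward(output, target)
model:backward(input, criterion:backward(output, target))

-- Layer 1 gradients stay zero; layer 2 gradients are populated.
print(model:get(1).gradWeight)
print(model:get(2).gradWeight)
```

Since the frozen layer's gradients remain zero after zeroGradParameters, a plain SGD step on the flattened parameters from model:getParameters() will leave that layer's weights unchanged.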