Hi All,
I've got a model that I can optimize manually, like this:
function train()
   for r = 1, numRuns do
      model:zeroGradParameters()
      local pred = model:forward(inputs)
      local err = crit:forward(pred, outputs)
      local grad = crit:backward(pred, outputs)
      model:backward(inputs, grad)
      model:updateParameters(0.01)
   end
end
But when I try to use optim, like this:
function optTrain()
   params, paramGrads = model:getParameters()
   local feval = function(x)
      if x ~= params then params:copy(x) end
      paramGrads:zero()
      local pred = model:forward(inputs)
      local err = crit:forward(pred, outputs)
      local grad = crit:backward(pred, outputs)
      model:backward(inputs, grad)
      return err, paramGrads
   end
   optimMethod(feval, params, optimState)
end
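For reference, optimMethod and optimState are defined elsewhere; roughly like this (plain SGD, with the same 0.01 learning rate as the manual loop):

   local optim = require 'optim'
   -- matches model:updateParameters(0.01) in the manual version
   optimState = { learningRate = 0.01 }
   optimMethod = optim.sgd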
I get this error:
inconsistent tensor size at /torch/pkg/torch/lib/TH/generic/THTensorMath.c:424
Any ideas for what I should fix? Do I need to resize the param vector for optim somehow?
Thanks!