OK I did it - but this is the SH*TT*EST piece of coding I've done this year - the consolation is that it works - and it matches the calculation I'm trying to reproduce :)))))
Pub lunch for anyone who can improve it
-- generate kernel: gauss[k][c] = exp(-(c - mu[k])^2 / (2*sigma_sqrd))
local gauss = torch.Tensor(N, C):zero()
for k = 1, N do
  for c = 1, C do
    gauss[k][c] = math.exp(-math.pow(c - mu[k], 2) / (2 * sigma_sqrd))
  end
end
-- normalize rows to sum to 1 -- find a better way of doing this !!!! (one attempt below)
local Z = torch.sum(gauss, 2)                   -- N x 1 row sums
local one_over_Z = torch.cdiv(torch.ones(N), Z) -- elementwise 1/Z (sizes differ, but element counts match)
local out = torch.Tensor(N, C):zero()
for k = 1, N do
  out[k] = torch.mul(gauss[k], one_over_Z[k])   -- scale row k by 1/Z[k]
end
-- check row sums: each should be 1
-- torch.sum(out, 2)
return out
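Not claiming the pub lunch, but here is a possible loop-free version of both steps (a sketch, untested - it assumes N, C, mu, sigma_sqrd are as above and that mu is a contiguous 1-D tensor of length N), using the same expand trick as the quoted reply below:

-- build the kernel without the double loop (sketch)
local cols = torch.range(1, C):view(1, C):expand(N, C):clone() -- row k holds 1..C
cols:add(-1, mu:view(N, 1):expand(N, C))                       -- row k becomes c - mu[k]
local gauss = cols:pow(2):div(-2 * sigma_sqrd):exp()           -- exp(-(c - mu[k])^2 / (2*sigma_sqrd))
-- normalize each row to sum to 1, no loop: expand the N x 1 row
-- sums to N x C and divide elementwise
local Z = torch.sum(gauss, 2)
local out = torch.cdiv(gauss, Z:expandAs(gauss))
-- torch.sum(out, 2) should again give all ones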
On Sunday, March 8, 2015 at 2:41:55 AM UTC, smth chntla wrote:
destructively normalize (zero-center each row of) matrix m in-place:
m:add(m:mean(2):mul(-1):expandAs(m))
I can give you other cases as well if you want.
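For example (my own quick check, not from the original reply), on a 2 x 3 matrix the one-liner subtracts each row's mean:

m = torch.Tensor{{1, 2, 3}, {4, 5, 6}}
m:add(m:mean(2):mul(-1):expandAs(m))
-- m is now {{-1, 0, 1}, {-1, 0, 1}}: every row has zero mean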