How nn.SoftMax works


Derk Mus

unread,
Sep 29, 2016, 5:24:21 AM
to torch7
I have made this simple script to reproduce the issue:

require 'nn'

local model = nn.Sequential()

model:add(nn.Identity()) -- your model, which outputs seqlen x batchsize x featsize

local c1 = nn.Sequential():add(nn.Narrow(3,1,1)):add(nn.Sigmoid())
local c2 = nn.Sequential():add(nn.Narrow(3,2,2)):add(nn.SoftMax())

local concat = nn.ConcatTable():add(c1):add(c2) -- feeds the same input to both c1 and c2
model:add(concat)

local t = torch.randn(1,1,3)


local out = model:forward(t)
print('input')
print(t)
print('output')

print(out[2])

This gives two ones as output for out[2], while I expect two values that sum to one because of the SoftMax. What is going wrong here?

Derk

unread,
Sep 29, 2016, 5:35:19 AM
to torch7
Okay, so it seems the softmax is applied along the first dimension, while my input is sequence length x batch size x feature length. I want to apply the softmax along the third dimension instead; is this possible?
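That observation explains the ones: for a 3D input, nn.SoftMax normalizes along the first dimension, and since that dimension has size 1 here, each element becomes exp(x)/exp(x) = 1. One way to normalize along the feature dimension instead is to collapse the leading dimensions with nn.View so SoftMax sees a 2D tensor, where it normalizes each row. A sketch (untested; the final nn.View sizes assume the 1 x 1 x 3 example input above):

```lua
require 'nn'

-- Collapse seqlen x batchsize into one dimension so nn.SoftMax
-- receives a 2D tensor and normalizes each row (the feature
-- dimension), then restore the 3D shape.
local c2 = nn.Sequential()
  :add(nn.Narrow(3, 2, 2)) -- seqlen x batchsize x 2
  :add(nn.View(-1, 2))     -- (seqlen * batchsize) x 2
  :add(nn.SoftMax())       -- softmax over the 2 features in each row
  :add(nn.View(1, 1, 2))   -- back to 1 x 1 x 2 for this example
```

Alternatively, nn.Transpose could move the feature dimension to the front, but reshaping through nn.View keeps the batched 2D SoftMax semantics explicit.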

On Thursday, September 29, 2016 at 11:24:21 AM UTC+2, Derk Mus wrote: