sharing parameters doesn't work with Siamese Network with pre-trained model


Ananthachari KV

Jul 26, 2017, 5:59:19 PM7/26/17
to torch7
Hi,

I'm working on a siamese network (assume each branch is an AlexNet).

I get an error at `parameters, gradParameters = net:getParameters()` when I clone each branch from a pre-trained model (ImageNet-pretrained AlexNet).

I don't see this error when I create the model from scratch (i.e., no cloning).

I looked at a potential solution of using nn.Container() instead of nn.ParallelTable(), and with that, sharing parameters does work — but then I don't know how to feed the two inputs in parallel.

model = nn.Sequential():add(nn.SplitTable(2)):add(nn.ParallelTable()):add(siamese):add(nn.PairwiseDistance(2)) --> cannot share parameters, throws an error
or
model = nn.Sequential():add(nn.SplitTable(2)):add(nn.Container()):add(siamese):add(nn.PairwiseDistance(2)) --> can share parameters, but throws an error while feeding data
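For reference, the usual Torch7 siamese pattern puts both branches *inside* the nn.ParallelTable and shares storage at clone time via `clone('weight','bias','gradWeight','gradBias')`, calling `getParameters()` exactly once on the top-level container. A minimal sketch, with a toy Linear branch standing in for the pretrained AlexNet:

```lua
require 'nn'

-- toy branch standing in for AlexNet; in practice this would be
-- the pre-trained network loaded from disk
local branch1 = nn.Sequential()
   :add(nn.Linear(10, 5))
   :add(nn.ReLU())

-- second branch shares weight/bias storage with the first,
-- so updates to one are visible in the other
local branch2 = branch1:clone('weight', 'bias', 'gradWeight', 'gradBias')

local siamese = nn.ParallelTable()
   :add(branch1)
   :add(branch2)

local model = nn.Sequential()
   :add(nn.SplitTable(2))       -- batch x 2 x 10  ->  { batch x 10, batch x 10 }
   :add(siamese)
   :add(nn.PairwiseDistance(2)) -- L2 distance between the two embeddings

-- call getParameters() once, on the top-level container only;
-- calling it again (or on a sub-module) re-flattens storage and
-- silently breaks the sharing
local params, gradParams = model:getParameters()
```

Note that `getParameters()` relocates every parameter into one flat storage, which is exactly why calling it on a pre-trained net *before* sharing (many serialized models have had it called already) leaves the clone pointing at stale storage.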

Kindly help,
Thanks


Ananthachari KV

Jul 26, 2017, 9:32:51 PM7/26/17
to torch7
For now, I read out the params & gradParams and copied the weights over by hand. It's a workaround.

But if you have a more elegant solution, it'd be helpful.
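One way to avoid the manual copy is to deep-clone the pre-trained net first (giving it fresh, unflattened storage), and only then create the sibling as a sharing clone. A sketch, where the `alexnet.t7` filename is a hypothetical placeholder for however the pretrained model was saved:

```lua
require 'nn'

-- hypothetical path; load the ImageNet-pretrained AlexNet however it was serialized
local pretrained = torch.load('alexnet.t7')

-- plain clone() deep-copies parameters into fresh contiguous storage,
-- detaching them from any earlier getParameters() flattening
local branch1 = pretrained:clone()

-- the sibling shares storage with branch1, not with the original
local branch2 = branch1:clone('weight', 'bias', 'gradWeight', 'gradBias')

local siamese = nn.ParallelTable():add(branch1):add(branch2)

-- build the full model around `siamese`, then call getParameters()
-- once on that top-level model
```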

Thanks!