what's the difference between train mode and evaluation mode


Mata Fu

Jan 18, 2017, 4:37:22 AM1/18/17
to torch7
The network at https://github.com/amitshaked/resmatch/blob/master/src/networks/network.lua converts the fully-connected layers to convolutional layers for the test network. After changing the net, it calls evaluate() at the end. What is this for?


function network.getTestNetwork(model)
   -- Replace the model with a fully-convolutional network
   -- with the same weights, and pad it to maintain resolution
   local testModel = model:clone('weight', 'bias')

   -- replace linear with 1x1 conv
   local nodes, containers = testModel:findModules('nn.Linear')
   for i = 1, #nodes do
      for j = 1, #(containers[i].modules) do
         if containers[i].modules[j] == nodes[i] then
            local w = nodes[i].weight
            local b = nodes[i].bias
            local conv = nn.SpatialConvolution1_fw(w:size(2), w:size(1)):cuda()
            conv.weight:copy(w)
            conv.bias:copy(b)
            -- Replace with a new instance
            containers[i].modules[j] = conv
         end
      end
   end

   -- replace reshape with concatenation
   nodes, containers = testModel:findModules('nn.Reshape')
   for i = 1, #nodes do
      for j = 1, #(containers[i].modules) do
         if containers[i].modules[j] == nodes[i] then
            -- Replace with a new instance
            containers[i].modules[j] = nn.Concatenation():cuda()
         end
      end
   end

   -- pad convolutions
   padConvs(testModel)

   -- switch to evaluation mode
   testModel:evaluate()

   return testModel
end
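As a side note on the fc-to-conv swap itself: a linear layer y = Wx + b applied to a C-channel input is the same computation as a 1x1 convolution whose filters are the rows of W, which is why the weights can simply be copied over. A pure-Python sanity check of that claim (illustrative only, not the Torch API; the helper names are made up):

```python
def linear(W, b, x):
    # W: out x in matrix, x: input vector of length "in"
    return [sum(W[o][i] * x[i] for i in range(len(x))) + b[o]
            for o in range(len(W))]

def conv1x1(W, b, feat):
    # feat: in_channels x H x W feature map; each row of W is one 1x1 filter
    H, Wd = len(feat[0]), len(feat[0][0])
    return [[[sum(W[o][c] * feat[c][y][x] for c in range(len(feat))) + b[o]
              for x in range(Wd)] for y in range(H)]
            for o in range(len(W))]

W = [[1.0, 2.0], [3.0, 4.0]]
b = [0.5, -0.5]
x = [10.0, 20.0]

# View the vector as a 2-channel, 1x1 feature map
feat = [[[x[0]]], [[x[1]]]]

print(linear(W, b, x))                           # [50.5, 109.5]
print([ch[0][0] for ch in conv1x1(W, b, feat)])  # [50.5, 109.5]
```

On larger feature maps the 1x1 conv then applies that same linear map at every spatial position, which is what makes the test network fully convolutional.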

Vislab

Jan 18, 2017, 5:28:13 AM1/18/17
to torch7
Layers like nn.Dropout() or nn.BatchNormalization() behave differently during training and testing (nn.Dropout() at test time is essentially an nn.Identity() module). The most practical way to switch all those layers at once in a sequence of layers is to have training() and evaluate() methods that trigger the right behaviour.
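To make the dropout case concrete, here is a minimal pure-Python sketch of the idea (illustrative only, not the actual Torch nn.Dropout implementation): in training mode, activations are zeroed at random and the survivors rescaled by 1/(1-p); after evaluate(), the layer passes its input through unchanged.

```python
import random

class Dropout:
    def __init__(self, p=0.5):
        self.p = p
        self.train = True  # modules start in training mode

    def evaluate(self):
        self.train = False

    def forward(self, xs):
        if self.train:
            # zero each activation with probability p, rescale the rest
            return [0.0 if random.random() < self.p else x / (1 - self.p)
                    for x in xs]
        return list(xs)  # identity at test time

random.seed(0)
d = Dropout(0.5)
print(d.forward([1.0, 2.0, 3.0, 4.0]))  # values are zeroed or doubled
d.evaluate()
print(d.forward([1.0, 2.0, 3.0, 4.0]))  # [1.0, 2.0, 3.0, 4.0]
```

Forgetting to call evaluate() means the test network still drops activations at random, so predictions become noisy and non-deterministic; that is why the function above switches the cloned model to evaluation mode before returning it.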