Losses do not sum to the total loss?


Alex Ter-Sarkisov

Mar 13, 2018, 1:46:59 PM
to Caffe Users
This is the output during the training (printed out every 100 iterations):

I0313 17:42:52.685120 12629 solver.cpp:218] Iteration 900 (0.781886 iter/s, 127.896s/100 iters), loss = 102373
I0313 17:42:52.685169 12629 solver.cpp:237]     Train net output #0: lossBlobsBadTwoCows = 2072.89 (* 1 = 2072.89 loss)
I0313 17:42:52.685179 12629 solver.cpp:237]     Train net output #1: lossBlobsBadTwoPredicts = 9165.83 (* 1 = 9165.83 loss)
I0313 17:42:52.685192 12629 solver.cpp:237]     Train net output #2: lossCombinedSigmoid = 15230.3 (* 1 = 15230.3 loss)
I0313 17:42:52.685205 12629 solver.cpp:237]     Train net output #3: lossGoodSigmoid = 72461.5 (* 1 = 72461.5 loss)
I0313 17:42:52.685215 12629 solver.cpp:237]     Train net output #4: lossNumBlobsBad = 0.11664 (* 1 = 0.11664 loss)
I0313 17:42:52.685225 12629 solver.cpp:237]     Train net output #5: lossNumBlobsBadTwoCows = 1.99832 (* 1 = 1.99832 loss)
I0313 17:42:52.685233 12629 solver.cpp:237]     Train net output #6: lossNumBlobsBadTwoPredicts = 0.442521 (* 1 = 0.442521 loss)
I0313 17:42:52.685243 12629 solver.cpp:237]     Train net output #7: lossNumBlobsGood = 0.0531871 (* 1 = 0.0531871 loss)


Why don't the losses (Train net outputs #0-#7) sum to the total loss of 102373? There are no other loss layers. Does this have something to do with the batch size?
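For concreteness, summing the eight per-output values from the log above (each multiplied by its loss weight of 1) gives roughly 98933, not 102373, a gap of about 3440. A quick sketch of the arithmetic:

```python
# Per-output losses printed at iteration 900, copied from the log above.
outputs = {
    "lossBlobsBadTwoCows": 2072.89,
    "lossBlobsBadTwoPredicts": 9165.83,
    "lossCombinedSigmoid": 15230.3,
    "lossGoodSigmoid": 72461.5,
    "lossNumBlobsBad": 0.11664,
    "lossNumBlobsBadTwoCows": 1.99832,
    "lossNumBlobsBadTwoPredicts": 0.442521,
    "lossNumBlobsGood": 0.0531871,
}

# Every output has loss weight 1 in the log, so the expected total
# is just the plain sum.
total = sum(outputs.values())
print(f"sum of outputs: {total:.1f}")        # ~98933.1
print(f"difference:     {102373 - total:.1f}")  # ~3439.9
```

One thing worth checking is the `average_loss` field in the solver prototxt: when it is greater than 1, solver.cpp reports a loss smoothed over the last `average_loss` iterations, while the per-output lines show the raw values for the current iteration only, so the two need not match.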