How are losses from a split layer combined when backpropagating the error to earlier layers?

Sk8er

Dec 20, 2015, 5:20:25 PM
to Caffe Users
Hi

Caffe automatically inserts a SplitLayer after any layer whose top blob serves as the bottom input of multiple other layers.
What if the two branches created by the split each connect to their own loss? How are the backpropagated losses combined for the earlier layers?

For example,

architecture:

    A - B - C - D - loss1
         \
          B_split - E - F - loss2

How are the gradients from loss1 and loss2 combined when computing the error in A? Let's assume B is a convolution layer.
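
My understanding is that, since the total objective is the sum of the two losses (assuming the default loss_weight of 1 for both), the gradient that reaches B should simply be the sum of the gradients coming back from the two branches:

    \frac{\partial L}{\partial B}
      = \frac{\partial(\mathrm{loss1} + \mathrm{loss2})}{\partial B}
      = \frac{\partial\,\mathrm{loss1}}{\partial B}
      + \frac{\partial\,\mathrm{loss2}}{\partial B}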

Could anyone show me where in the code the two losses are combined at B?



malte.oe...@gmail.com

Dec 21, 2015, 3:35:54 AM
to Caffe Users
Hi Sk8er,

See Marc'Aurelio Ranzato's LSVR tutorial @ CVPR 2014, specifically slide 37. I hope this helps.

Regards
Malte

Zizhao Zhang

Dec 21, 2015, 7:56:34 PM
to malte.oe...@gmail.com, Caffe Users
Hi Malte,

Thanks for your reply. I understand the idea now, but I am curious where, in the code, the errors from the two branches are summed. Caffe seems to split layers that serve as input to multiple layers, so during the forward/backward pass the computation of the two branches appears to be separate. Where are they summed together?
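
Digging around a bit, I think I found the relevant places: InsertSplits in src/caffe/util/insert_splits.cpp seems to insert the Split layers when the net is constructed, and SplitLayer's backward pass in src/caffe/layers/split_layer.cpp looks like where the diffs are summed. If I read it correctly, the logic is roughly this (a sketch; details may differ between Caffe versions):

    template <typename Dtype>
    void SplitLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
        const vector<bool>& propagate_down,
        const vector<Blob<Dtype>*>& bottom) {
      if (!propagate_down[0]) { return; }
      if (top.size() == 1) {
        // Only one consumer: copy the gradient straight through.
        caffe_copy(count_, top[0]->cpu_diff(), bottom[0]->mutable_cpu_diff());
        return;
      }
      // Two consumers (here: the C..loss1 branch and the E..loss2 branch):
      // sum their diffs into the bottom diff, so the layers below B
      // receive d(loss1)/dB + d(loss2)/dB.
      caffe_add(count_, top[0]->cpu_diff(), top[1]->cpu_diff(),
                bottom[0]->mutable_cpu_diff());
      // Any further branches are accumulated in place (y += 1 * x).
      for (int i = 2; i < top.size(); ++i) {
        caffe_axpy(count_, Dtype(1.), top[i]->cpu_diff(),
                   bottom[0]->mutable_cpu_diff());
      }
    }

So the two branches really are computed independently during the forward/backward pass, and it is this Split backward that re-joins their gradients before the error continues down to B and A.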

Zizhao






--
Best Regards,
Zizhao