Missing backward function call for SoftmaxWithLoss

ChungYu

Mar 16, 2015, 5:23:56 AM
to caffe...@googlegroups.com
Hi everyone, 

I want to understand how the loss/cost propagates backward in Caffe,
so I added a print statement to inner_product_layer.* and softmax_* to trace the function calls during fine-tuning.

I run Caffe fine-tuning in GPU mode, and the output shows something like:


inner product FW@GPU
inner product FW@GPU
inner product FW@GPU
SoftmaxWithLossLayer FW@GPU 
SoftmaxLayer FW@GPU
SoftmaxWithLossLayer BW@GPU
inner product BW@GPU
inner product BW@GPU
inner product BW@GPU
inner product FW@GPU
inner product FW@GPU
inner product FW@GPU
...

Since the GPU implementation of SoftmaxWithLossLayer's forward hasn't been done yet,
the forward step of SoftmaxWithLossLayer on the GPU ends up calling the internal SoftmaxLayer's GPU forward instead.
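To make sure I read the log right, here is a tiny standalone sketch of the delegation I think is happening (my own simplification, not the actual Caffe code; the class and member names only mimic Caffe's):

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct SoftmaxLayer {
  std::vector<double> Forward(const std::vector<double>& logits) {
    std::printf("SoftmaxLayer FW\n");
    // Shift by the max logit for numerical stability, then normalize.
    double max_logit = *std::max_element(logits.begin(), logits.end());
    std::vector<double> prob(logits.size());
    double sum = 0.0;
    for (size_t i = 0; i < logits.size(); ++i) {
      prob[i] = std::exp(logits[i] - max_logit);
      sum += prob[i];
    }
    for (double& p : prob) p /= sum;
    return prob;
  }
};

struct SoftmaxWithLossLayer {
  SoftmaxLayer softmax_;      // internal layer, like Caffe's softmax_layer_
  std::vector<double> prob_;  // cached probabilities for the backward pass

  double Forward(const std::vector<double>& logits, int label) {
    std::printf("SoftmaxWithLossLayer FW\n");
    prob_ = softmax_.Forward(logits);  // the delegation seen in the log
    return -std::log(prob_[label]);    // cross-entropy loss
  }
};

This would explain why "SoftmaxWithLossLayer FW@GPU" is immediately followed by "SoftmaxLayer FW@GPU" in my trace.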

However, when Backward is called on SoftmaxWithLossLayer at the GPU, which I expected to call SoftmaxLayer's GPU backward,
no such function call appears in the trace.
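My guess is that the gradient of softmax followed by cross-entropy collapses to (prob - label), so the backward could be computed directly from the cached probabilities without ever calling SoftmaxLayer's backward. A hypothetical sketch of that direct computation (the function name is mine, not Caffe's):

#include <vector>

// Hypothetical direct backward for the combined layer: since
// dL/dz_i = p_i - [i == label], the cached softmax output is enough
// and no separate SoftmaxLayer backward call is needed.
std::vector<double> SoftmaxWithLossBackward(const std::vector<double>& prob,
                                            int label) {
  std::vector<double> bottom_diff = prob;  // start from the probabilities
  bottom_diff[label] -= 1.0;               // subtract 1 at the true class
  return bottom_diff;
}

Can anyone confirm whether this is what actually happens?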

Could anyone tell me where the backward pass goes?

Thanks for answering!