Using softmax with loss but want to calculate Euclidean loss as well.

Atena Nguyen

Jul 18, 2017, 8:43:29 AM
to Caffe Users
Hi all, 

I have a question regarding loss function. 

Currently, I use SoftmaxWithLoss as my loss function for training. However, my problem requires calculating the L2 distance between the predicted values and the labels.
Is there any way to calculate this? 


Best regards, 
Atena 

Jonathan R. Williford

Jul 18, 2017, 8:55:17 AM
to Caffe Users
You can use the HingeLoss layer:


Some of the documentation is hidden in the Doxygen-generated documentation:

Best,
Jonathan


Atena Nguyen

Jul 18, 2017, 11:07:48 AM
to Caffe Users
Thank you for your reply. 

I use SoftmaxWithLoss for my problem, but I also want to see (display or analyze) the L2 or L1 distance loss (such as with the HingeLoss layer you mention). 

Do you have any suggestions? 


Przemek D

Jul 19, 2017, 3:04:27 AM
to Caffe Users
If you leave SoftmaxWithLoss as it is and add HingeLoss as Jonathan suggests, but set loss_weight: 0 on it, the layer will still compute its loss (and it will show up in the logs); however, the gradient will be multiplied by 0 before propagating down (that is, it will not affect training).
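
Since the original question was about L2 distance, the same loss_weight: 0 trick also works with Caffe's EuclideanLoss layer. A minimal sketch, with hypothetical blob names "pred" and "l2_monitor"; note that EuclideanLoss, unlike SoftmaxWithLoss, requires both bottoms to have identical shapes:

layer {
  name: "l2_monitor"    # hypothetical name; any unique name works
  type: "EuclideanLoss"
  bottom: "pred"        # predictions, same shape as "label"
  bottom: "label"
  top: "l2_monitor"
  loss_weight: 0        # computed and logged, but not backpropagated
}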

Atena Nguyen

Jul 19, 2017, 5:11:10 AM
to Caffe Users
Could you give an example of combining multiple losses, like HingeLoss and SoftmaxWithLoss? 
I understand what you mean but do not know how to implement it. 



Atena Nguyen

Jul 19, 2017, 5:17:27 AM
to Caffe Users

Something like this?

layer {
  name: "softmax"
  type: "Softmax"
  bottom: "score"
  top: "softmax"
}
layer {
  name: "hingleloss"
  type: "HingleLoss"
  bottom: "softmax"
  bottom: "label"
  top: "hingleloss"
  loss_weight: 0
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "score"
  bottom: "label"
  top: "loss"
  loss_param {
    normalize: false
  }
}


Jonathan R. Williford

Jul 19, 2017, 5:35:48 AM
to Atena Nguyen, Caffe Users
Yes. Thanks for writing out the example. If you are using the command line (instead of PyCaffe, for example), as long as the blob "hingleloss" is not fed into another layer, the result will be printed out. Of course, if you are using PyCaffe, you can just look at the blob values directly.
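
For example, a minimal PyCaffe sketch of reading a loss blob after a forward pass (the file names here are hypothetical, and the blob names follow the prototxt above; substitute your own):

import caffe

# Hypothetical file names; replace with your own net definition and weights.
net = caffe.Net('train_val.prototxt', 'weights.caffemodel', caffe.TEST)

# One forward pass fills every blob, including the loss blobs.
net.forward()
print('softmax loss:', float(net.blobs['loss'].data))
print('monitored hinge loss:', float(net.blobs['hingleloss'].data))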

Did you have any issues with it?

Cheers,
Jonathan


Atena Nguyen

Jul 19, 2017, 5:41:02 AM
to Caffe Users, ngct...@gmail.com
I am using PyCaffe, and I generate the prototxt file with Python. However, I encountered a problem with this structure: 
I could not train the network and got this error:

I0719 18:37:12.834614  9724 net.cpp:137] Memory required for data: 35389440
I0719 18:37:12.83*** Check failure stack trace: ***

I still do not understand what is happening. 


Jonathan R. Williford

Jul 19, 2017, 9:58:07 AM
to Atena Nguyen, Caffe Users
Sorry, I missed something in your prototxt: the first bottom of the hinge loss should be "score" (the values should range from -infinity to infinity; see the Doxygen docs). Also, why is the type "HingleLoss" and not "HingeLoss"? I'm not sure how Caffe didn't throw an error for that.
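
For reference, a sketch of the layer with both fixes applied (type corrected to "HingeLoss", first bottom changed to the raw "score" blob):

layer {
  name: "hingeloss"
  type: "HingeLoss"
  bottom: "score"   # raw scores, not the softmax output
  bottom: "label"
  top: "hingeloss"
  loss_weight: 0    # monitor only, as discussed above
}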

Jonathan


Atena Nguyen

Jul 19, 2017, 11:39:09 AM
to Caffe Users, ngct...@gmail.com
Yes, the correct name should be HingeLoss, not HingleLoss. My network is now running. 

For the HingeLoss layer: since I work on a segmentation-like problem (a modified FCN net), the output of "score" is not the same size as the labels (which are 2D images). That is why I need a Softmax layer before it. 


mhaoy...@gmail.com

May 10, 2018, 1:25:00 AM
to Caffe Users
Do you know how to deploy HingeLoss?
I mean, what does it look like when the model is used for classification?
How should the model prototxt be changed for HingeLoss?
I think the softmax layer should be changed.


layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8-"
  top: "prob"
}


