Validation Loss higher than Training Loss

Caleb Belth

Jul 20, 2016, 8:07:19 PM
to Caffe Users
I'm training my own dataset. My validation loss began at 1.65334 and has dropped to 0.35824 at 10,000 iterations and appears to have settled. However, my training loss began at 1.84388 and has dropped to 0.0420767. Is this a sign of overfitting? If not, what would lead to the validation loss being significantly higher than the training loss?

Roger

Jul 21, 2016, 4:28:26 PM
to Caffe Users
It is probably overfitting.
Are you using some regularization, like dropout?
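As a refresher on what dropout actually does during training, here is a minimal numpy sketch (not Caffe code; the keep probability `p = 0.5` and the function name are illustrative) of "inverted" dropout:

```python
import numpy as np

def dropout_forward(x, p=0.5, train=True, rng=None):
    """Inverted dropout: zero each activation with probability (1 - p),
    and scale survivors by 1/p so the expected activation is unchanged."""
    if not train:
        return x  # at test time dropout is a no-op
    rng = rng or np.random.default_rng(0)
    mask = (rng.random(x.shape) < p) / p
    return x * mask

acts = np.ones((4, 8))
out = dropout_forward(acts, p=0.5)
# with p=0.5, each surviving activation is scaled from 1.0 to 2.0
```

Because the network cannot rely on any single unit being present, dropout discourages co-adaptation and usually narrows the gap between training and validation loss.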

Caleb Belth

Jul 21, 2016, 5:58:09 PM
to Caffe Users
Thanks for the response. Yes, I'm using CaffeNet, which has dropout layers.

Bartosz Ludwiczuk

Jul 22, 2016, 4:17:38 AM
to Caffe Users
There are many ways to prevent overfitting; I will list some of them:
1. Dropout: remember that you can use dropout between conv layers too, with a small ratio like ~0.2.
2. BatchNorm: batch normalization also reduces overfitting.
3. Data augmentation: add rotation, cropping, mirroring, and color manipulation.
4. Decrease model capacity: use fewer parameters, e.g. by removing layers or replacing FC layers with conv layers.
5. Model ensemble: training several models and averaging their outputs almost always decreases validation error.
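To illustrate point 5, a toy numpy sketch of ensembling by averaging: the per-model class probabilities below are made-up numbers, not outputs of a real network.

```python
import numpy as np

# Hypothetical softmax outputs for one image from 3 models over 4 classes.
model_probs = np.array([
    [0.70, 0.10, 0.10, 0.10],
    [0.40, 0.35, 0.15, 0.10],
    [0.60, 0.05, 0.25, 0.10],
])

# Ensemble prediction: average the probabilities, then take the argmax.
ensemble = model_probs.mean(axis=0)
pred = int(np.argmax(ensemble))
# the averaged distribution still sums to 1; class 0 wins here
```

Averaging smooths out each individual model's idiosyncratic errors, which is why ensembles tend to generalize better than any single member.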