Hello, I am working on a regression problem. My network is not converging; I tried different learning rates, momentum values, and weight decay, but nothing worked. Initially the loss decreased, and it is now oscillating around 30.
The following is from iteration 154000, which is why you do not see the initial loss reduction. Below is the debug info. Are these values normal, or is this a vanishing gradient problem? I have no clue what is going wrong. I tried different network architectures (AlexNet, VGG16). I could try something else as well, but it would be great if you could suggest something based on this.
Thank you very much in advance!
I0526 18:42:20.706812 16062 net.cpp:647] [Backward] Layer conv3, param blob 1 diff: 0.0118202
I0526 18:42:20.706897 16062 net.cpp:636] [Backward] Layer pool2, bottom blob norm2 diff: 0.000126111
I0526 18:42:20.707118 16062 net.cpp:636] [Backward] Layer norm2, bottom blob conv2 diff: 0.000126107
I0526 18:42:20.707175 16062 net.cpp:636] [Backward] Layer relu2, bottom blob conv2 diff: 9.69918e-05
I0526 18:42:20.707670 16062 net.cpp:636] [Backward] Layer conv2, bottom blob pool1 diff: 0.000442692
I0526 18:42:20.707736 16062 net.cpp:647] [Backward] Layer conv2, param blob 0 diff: 0.000754526
I0526 18:42:20.707794 16062 net.cpp:647] [Backward] Layer conv2, param blob 1 diff: 0.0180862
I0526 18:42:20.707870 16062 net.cpp:636] [Backward] Layer pool1, bottom blob norm1 diff: 8.73312e-05
I0526 18:42:20.707976 16062 net.cpp:636] [Backward] Layer norm1, bottom blob pconv1 diff: 8.73285e-05
I0526 18:42:20.708031 16062 net.cpp:636] [Backward] Layer relu1, bottom blob pconv1 diff: 7.55169e-06
I0526 18:42:20.708721 16062 net.cpp:647] [Backward] Layer pconv1, param blob 0 diff: 0.000126193
I0526 18:42:20.708776 16062 net.cpp:647] [Backward] Layer pconv1, param blob 1 diff: 0.00693271
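To get a feel for whether these numbers point to vanishing gradients, one simple check is to compare the weight-gradient magnitudes (the "param blob 0" lines) across depth: a drop of many orders of magnitude toward the input layer would suggest vanishing gradients, while a factor of a few between layers is common. Here is a minimal sketch of that check, using only the two "param blob 0" values from the log above (the parsing helper is my own, not part of Caffe):

```python
import re

# "param blob 0" (weight gradient) lines copied from the debug_info
# output above; only conv2 and pconv1 report param blob 0 there.
log_lines = [
    "[Backward] Layer conv2, param blob 0 diff: 0.000754526",
    "[Backward] Layer pconv1, param blob 0 diff: 0.000126193",
]

pattern = re.compile(r"Layer ([^,]+), param blob 0 diff: (\S+)")

# Map layer name -> mean absolute weight gradient reported by Caffe.
grads = {}
for line in log_lines:
    m = pattern.search(line)
    if m:
        grads[m.group(1)] = float(m.group(2))

# Roughly a 6x drop from conv2 to pconv1 -- noticeable attenuation,
# but not the many-orders-of-magnitude collapse typical of truly
# vanishing gradients.
ratio = grads["conv2"] / grads["pconv1"]
print(f"conv2/pconv1 weight-gradient ratio: {ratio:.1f}")
```

If the same ratio computed between the deepest and shallowest layers of the full log stays within one or two orders of magnitude, the oscillating loss is more likely a learning-rate or data issue than a vanishing-gradient one.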