Dear all,
I'm fine-tuning a previously trained network.
Now I see that the validation loss starts increasing while the training loss constantly decreases.
I know this is probably overfitting, but the validation loss starts increasing right after the first epoch ends.
I use a batch size of 24 and a training set of 500k images, so one epoch is roughly 20,800 iterations (500,000 / 24).
Could overfitting really start that soon? It seems to me there might be some other reason.
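For what it's worth, the pattern I'm describing (training loss falling while validation loss rises) can be checked programmatically from the per-epoch loss history. This is just a minimal sketch with made-up loss values resembling my run; the function name and numbers are hypothetical, not my actual training code:

```python
def divergence_epoch(train_losses, val_losses):
    """Return the first epoch index where validation loss rises
    while training loss still falls (a typical overfitting sign),
    or None if the two curves never diverge that way."""
    for e in range(1, min(len(train_losses), len(val_losses))):
        if val_losses[e] > val_losses[e - 1] and train_losses[e] < train_losses[e - 1]:
            return e
    return None

# Hypothetical per-epoch losses resembling the run described above:
train = [0.90, 0.60, 0.45, 0.35]
val   = [0.80, 0.70, 0.75, 0.82]
print(divergence_epoch(train, val))  # prints 2: divergence starts at epoch index 2
```

In my case the divergence shows up already between the first and second epoch, which is what makes me suspect something other than plain overfitting.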
Thanks