Is Caffe deterministic with a given random seed?


Henggang Cui
Jul 8, 2015, 8:15:44 PM
to caffe...@googlegroups.com
Hi,

I find there's a "random_seed" parameter that we can set in solver.prototxt. I think Caffe uses this random seed for all random number generation throughout the execution, so the results should be consistent from run to run.
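
For reference, a minimal sketch of what my solver.prototxt looks like (everything apart from random_seed is just a placeholder, not my exact settings):

net: "train_val.prototxt"
base_lr: 0.01
lr_policy: "step"
stepsize: 10000
momentum: 0.9
weight_decay: 0.0005
max_iter: 50000
solver_mode: GPU
# fixed seed that should control all of Caffe's random number generators
random_seed: 1701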

However, I find this is not the case. I set the same "random_seed" parameter, but the training loss and test accuracies still differ from run to run. What else can cause Caffe to be non-deterministic?

Thanks,
Cui

Andrei Pokrovsky
Aug 4, 2015, 12:49:57 AM
to Caffe Users

Are you training on GPU?

One reason could be the use of atomicAdd in the convolution kernels, which can produce different sums from the same numbers depending on the order of summation.
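
To illustrate why the order matters: floating-point addition is not associative, and atomicAdd gives no guarantee about the order in which the threads are serialized, so the rounded total can change between runs. Here is a small standalone CUDA sketch (not Caffe code, just a minimal illustration of the effect):

#include <cstdio>
#include <cuda_runtime.h>

// Every thread adds one element of the input into a single float accumulator
// with atomicAdd. The hardware serializes the atomics in an unspecified order,
// and float addition is not associative, so the rounded total can change from
// run to run even though the input never changes.
__global__ void atomic_sum(const float* data, int n, float* result) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        atomicAdd(result, data[i]);
    }
}

int main() {
    const int n = 1 << 20;
    float* h_data = new float[n];
    // A mix of magnitudes makes the rounding sensitive to summation order.
    for (int i = 0; i < n; ++i) {
        h_data[i] = (i % 2 == 0) ? 1e-6f : 1e6f;
    }

    float *d_data, *d_result;
    cudaMalloc(&d_data, n * sizeof(float));
    cudaMalloc(&d_result, sizeof(float));
    cudaMemcpy(d_data, h_data, n * sizeof(float), cudaMemcpyHostToDevice);

    for (int run = 0; run < 3; ++run) {
        cudaMemset(d_result, 0, sizeof(float));
        atomic_sum<<<(n + 255) / 256, 256>>>(d_data, n, d_result);
        float sum = 0.0f;
        cudaMemcpy(&sum, d_result, sizeof(float), cudaMemcpyDeviceToHost);
        printf("run %d: sum = %f\n", run, sum);  // may differ slightly between runs
    }

    cudaFree(d_data);
    cudaFree(d_result);
    delete[] h_data;
    return 0;
}

The same kind of effect shows up inside convolution kernels that accumulate gradients with atomicAdd.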

Andrei Pokrovsky
Aug 4, 2015, 12:53:49 AM
to Caffe Users

Another possible cause is using different CPUs or GPUs, or a different number of threads, across the training runs.
It's quite difficult to get exactly deterministic behavior across different hardware, but at least on the same hardware I think Caffe should support a mode requiring that all of its implementations produce deterministic results.


ngc...@gmail.com
Aug 5, 2015, 8:17:53 AM
to Caffe Users
If you use cuDNN, that may also be the reason: some of its convolution algorithms are not deterministic because they rely on atomic operations, so the order of summation can vary between runs.