Implementation of the backprop algorithm in the batch normalization layer


Ivan Krijan

Sep 2, 2020, 4:03:45 AM
to TensorFlow Community Testing
[cross-posting from TensorFlow Developers]

Hi all!

[intro]
We used TensorFlow (v2.1.0) to develop a neural-network model that works well for our purposes.
We concluded that our model is simple enough that we can implement it ourselves in C++ (header only).

We finished our implementation and compared our results with those from TensorFlow.
Of course, there are some differences.
We managed to track them down to the backward pass of the BN layer: our output isn't the same as TensorFlow's.

So we would like to know the exact implementation of this algorithm in TensorFlow.

[QUESTION]
Is the backpropagation through the BN layer equivalent to the one explained in this article: https://arxiv.org/pdf/1502.03167.pdf (page 4)?

To be more precise, are the required derivatives computed in exactly that way?
Or is some small value perhaps added somewhere for additional numerical stability?
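
To make the question concrete, here is the per-feature backward pass as we read it from the equations on page 4 of the paper, over a batch of m samples. This is just our own sketch, not TensorFlow's code (the names bn_backward, eps, etc. are ours), and eps is the variance epsilon from the forward-pass normalization:

#include <cmath>
#include <cstddef>
#include <vector>

// Gradients of the loss w.r.t. the inputs and the learned scale/shift
// for one feature, following arXiv:1502.03167, page 4.
struct BnGrads {
    std::vector<float> dx;  // dL/dx_i
    float dgamma;           // dL/dgamma
    float dbeta;            // dL/dbeta
};

// x, dy: the m values of one feature across the batch (inputs and
// upstream gradients); mu, var: batch statistics saved from the
// forward pass; eps: the variance epsilon used in the forward pass.
BnGrads bn_backward(const std::vector<float>& x,
                    const std::vector<float>& dy,
                    float mu, float var, float gamma, float eps) {
    const std::size_t m = x.size();
    const float inv_std = 1.0f / std::sqrt(var + eps);

    BnGrads g{std::vector<float>(m), 0.0f, 0.0f};

    float dvar = 0.0f, dmu = 0.0f, sum_xc = 0.0f;
    for (std::size_t i = 0; i < m; ++i) {
        const float xc = x[i] - mu;         // x_i - mu
        const float dxhat = dy[i] * gamma;  // dL/dxhat_i = dL/dy_i * gamma
        dvar += dxhat * xc;
        dmu += dxhat;
        sum_xc += xc;
        g.dgamma += dy[i] * xc * inv_std;   // dL/dgamma = sum_i dL/dy_i * xhat_i
        g.dbeta += dy[i];                   // dL/dbeta  = sum_i dL/dy_i
    }
    // dL/dvar = sum_i dL/dxhat_i * (x_i - mu) * (-1/2) * (var + eps)^(-3/2)
    dvar *= -0.5f * inv_std * inv_std * inv_std;
    // dL/dmu = -inv_std * sum_i dL/dxhat_i + dL/dvar * (-2/m) * sum_i (x_i - mu)
    dmu = -inv_std * dmu + dvar * (-2.0f / m) * sum_xc;

    // dL/dx_i = dL/dxhat_i * inv_std + dL/dvar * 2(x_i - mu)/m + dL/dmu / m
    for (std::size_t i = 0; i < m; ++i) {
        const float dxhat = dy[i] * gamma;
        g.dx[i] = dxhat * inv_std + dvar * 2.0f * (x[i] - mu) / m + dmu / m;
    }
    return g;
}

If TensorFlow computes these same quantities but rearranges the terms algebraically, we would expect only small floating-point differences, not the ones we are seeing.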

I found that the implementation is in the file "batch_norm_op.h" (v2.1.0): https://github.com/tensorflow/tensorflow/blob/v2.1.0/tensorflow/core/kernels/batch_norm_op.h

Any help on this issue would be much appreciated.

Thanks very much in advance!

Best regards,
Ivan