Alternatives to using BatchNormalisation in TFLiteMicro

morten opprud

Feb 15, 2021, 8:33:51 AM
to SIG Micro
Has anyone found an alternative to using BatchNormalisation?

I have a 5-layer CNN that performs well during training/test when using BatchNorm in each layer, and it performs poorly without BN.

Morten Opprud 

morten opprud

Feb 16, 2021, 2:25:34 PM
to SIG Micro, morten opprud

It seems that using SELU activation with the lecun_normal kernel initializer, plus normalisation of the input, has a similar normalising effect, as suggested by Klambauer et al.

On my 5-layer CNN I actually get performance similar to BN and much faster training.
However, SELU is not currently supported in TFLite Micro; it seems only ReLU is currently available.

Can anyone suggest how an implementation of an activation function like SELU should be initiated?
I would guess it is less complex than implementing BatchNorm for inference.
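(For reference, SELU itself is just an elementwise function, so the kernel math is simple; below is a minimal sketch of the SELU formula from Klambauer et al. in Python, not TFLite Micro's actual kernel API.)

```python
import math

# SELU constants from Klambauer et al., "Self-Normalizing Neural Networks"
SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def selu(x: float) -> float:
    """Elementwise SELU: scale * x for x > 0,
    scale * alpha * (exp(x) - 1) otherwise."""
    if x > 0.0:
        return SELU_SCALE * x
    return SELU_SCALE * SELU_ALPHA * (math.exp(x) - 1.0)
```

A real TFLite Micro kernel would apply this function over the tensor's flat buffer, the same way the existing ReLU kernel does.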

Zhe HE

Feb 17, 2021, 12:12:56 PM
to morten opprud, SIG Micro
Isn't that just multiplication and addition? At inference, BN uses frozen statistics, so you can replace it with those two ops. But you need to set the parameters correctly.
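(Concretely: with frozen statistics, BN collapses to y = a*x + b. A sketch of folding the learned BN parameters into that multiply-add, with illustrative names:)

```python
import math

def fold_batchnorm(gamma: float, beta: float, mean: float, var: float,
                   eps: float = 1e-5) -> tuple[float, float]:
    """Collapse inference-time BN,
        y = gamma * (x - mean) / sqrt(var + eps) + beta,
    into a single multiply-add, y = a * x + b."""
    a = gamma / math.sqrt(var + eps)
    b = beta - a * mean
    return a, b
```

In practice the converter (or you, by hand) folds these constants per channel, so the runtime graph contains only MUL and ADD ops.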
