Alternatives to using BatchNormalisation in TFLiteMicro


morten opprud

Feb 15, 2021, 8:33:51 AM
to SIG Micro
Has anyone found an alternative to using BatchNormalisation?

I have a 5-layer CNN that performs well during training/test when using BatchNorm in each layer, and it performs poorly without BN.

br
Morten Opprud 


morten opprud

Feb 16, 2021, 2:25:34 PM
to SIG Micro, morten opprud

It seems that using the SELU activation together with the LeCun kernel initializer (lecun_normal) and normalisation of the input has a similar normalising effect, as suggested by Klambauer et al.
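
For reference, a minimal Keras sketch of that setup (the input shape, filter counts, and class count below are made up for illustration; the relevant parts are activation="selu", kernel_initializer="lecun_normal", and inputs standardised beforehand):

import tensorflow as tf

# Inputs are assumed to be standardised to zero mean / unit variance,
# which the self-normalising setup of Klambauer et al. relies on.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, padding="same",
                           activation="selu",
                           kernel_initializer="lecun_normal",
                           input_shape=(64, 64, 1)),   # hypothetical input shape
    tf.keras.layers.Conv2D(32, 3, padding="same",
                           activation="selu",
                           kernel_initializer="lecun_normal"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),   # hypothetical class count
])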

On my 5-layer CNN I actually get performance similar to BN, and training is much faster.
However, SELU is not currently supported in TFLite Micro; it seems only ReLU is currently available, see https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/micro/kernels/activations.cc

Can anyone suggest how to get started on implementing an activation function like SELU?
I would guess it is less complex than implementing BatchNorm for inference.
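
For what it's worth, the float version of SELU is just an elementwise function, roughly as sketched below (constants are from Klambauer et al.; this is only a reference sketch, not a TFLite Micro kernel, and a quantized int8 kernel would presumably need something more, e.g. a lookup table):

import numpy as np

# SELU constants from Klambauer et al., "Self-Normalizing Neural Networks".
SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def selu_reference(x):
    # scale * x                         for x > 0
    # scale * alpha * (exp(x) - 1)      otherwise
    x = np.asarray(x, dtype=np.float32)
    return SELU_SCALE * np.where(x > 0, x, SELU_ALPHA * np.expm1(x))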

Zhe HE

Feb 17, 2021, 12:12:56 PM
to morten opprud, SIG Micro
Isn't BatchNorm at inference time just a multiplication and an addition? You can replace BN with those two ops, but you need to set the parameters correctly.
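
To make that concrete, here is one possible sketch of folding the inference-time BN into the preceding conv's weights and bias, so no extra op is needed at all (NumPy, Keras-style weight layout assumed; the function name and the 1e-3 epsilon, which matches Keras' BatchNormalization default, are my own choices):

import numpy as np

def fold_batchnorm(conv_w, conv_b, gamma, beta, mean, var, eps=1e-3):
    # conv_w: (kh, kw, in_ch, out_ch) conv kernel
    # conv_b: (out_ch,) conv bias (zeros if the conv had no bias)
    # gamma, beta, mean, var: (out_ch,) BN parameters / moving statistics
    scale = gamma / np.sqrt(var + eps)          # per-channel multiplier
    folded_w = conv_w * scale                   # broadcasts over the output-channel axis
    folded_b = (conv_b - mean) * scale + beta   # per-channel addition
    return folded_w, folded_b

As far as I know, the TFLite converter usually performs this fold automatically when a BatchNormalization layer directly follows a conv, so BN often costs nothing at inference time anyway.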

ZH
