It seems that using the SELU activation with the LeCun normal kernel initializer (lecun_normal) and normalisation of the inputs has a similar normalising effect, as suggested by Klambauer et al. in the self-normalizing networks paper.
On my 5-layer CNN I actually get similar performance to batch normalisation (BN) and much faster training.
However, SELU is not currently supported in TFLite Micro; it seems only ReLU is currently available in https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/micro/kernels/activations.cc
Can anyone suggest how the implementation of an activation function like SELU should be started?
I would guess it is less complex than implementing BatchNorm for inference.
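For context, the element-wise math itself is simple. Here is a minimal C++ sketch of what the float path of such a kernel would compute, using the fixed constants from the SELU paper; the names (Selu, SeluFloat, kSeluAlpha, kSeluScale) are my own for illustration, not anything from the TFLite Micro codebase, and an actual kernel would still need to follow the registration pattern of the existing ReLU ops in activations.cc:

#include <cmath>
#include <cstddef>

// Fixed SELU constants from Klambauer et al. (2017).
constexpr float kSeluAlpha = 1.67326324f;
constexpr float kSeluScale = 1.05070098f;

// selu(x) = scale * x                     for x > 0
//         = scale * alpha * (exp(x) - 1)  for x <= 0
inline float Selu(float x) {
  return x > 0.0f ? kSeluScale * x
                  : kSeluScale * kSeluAlpha * std::expm1(x);
}

// Applies SELU over a flat float buffer, the same element-wise
// loop shape the existing activation kernels use.
void SeluFloat(const float* input, float* output, std::size_t size) {
  for (std::size_t i = 0; i < size; ++i) {
    output[i] = Selu(input[i]);
  }
}

A quantized (int8) path would be more work, since SELU is non-linear; I believe TFLite handles other non-linear activations like tanh and logistic with lookup tables for int8, so something similar might apply here.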