About some new activation functions?


shunfei chen

May 9, 2018, 2:51:42 AM
to kaldi-help
Hello,
I recently read some papers about activation functions:
1) Self-Normalizing Neural Networks
2) Investigative Study of Various Activation Functions for Speech Recognition
3) Searching for Activation Functions

These papers suggest that some newer activation functions (such as ELU, SELU, and Swish) can outperform ReLU. Has anyone tried them in speech recognition or Kaldi? I implemented the SELU activation function in Kaldi, but the results on my own data set are much worse, especially with the chain model:
1) nnet3: ReLU 22.88%, SELU 24.68%
2) chain: ReLU 22.41%, SELU 39.46%
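
For reference, here is a minimal scalar sketch of the two definitions (just the math, not the actual Kaldi component code, which operates on matrices). The constants are the standard ones from the SELU paper:

#include <cmath>

// Standard SELU constants from "Self-Normalizing Neural Networks"
// (Klambauer et al., 2017).
const double kSeluAlpha  = 1.6732632423543772;
const double kSeluLambda = 1.0507009873554805;

// SELU(x) = lambda * x                      for x > 0
//         = lambda * alpha * (exp(x) - 1)   for x <= 0
double Selu(double x) {
  return x > 0.0 ? kSeluLambda * x
                 : kSeluLambda * kSeluAlpha * (std::exp(x) - 1.0);
}

// Swish(x) = x * sigmoid(x), from "Searching for Activation Functions"
// (Ramachandran et al., 2017).
double Swish(double x) {
  return x / (1.0 + std::exp(-x));
}

With these constants, SELU is designed to keep activations approximately zero-mean and unit-variance through deep networks, which is the "self-normalizing" property described in the paper.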

Daniel Povey

May 9, 2018, 3:05:58 PM
to kaldi-help
I have tried Swish and didn't see any consistent improvement; I may have tried SELU with the same result, but I'm not 100% sure. I think maybe the reason those activation functions haven't really caught on is that, while they may help on certain toy tasks, they don't help on larger tasks for some reason.

Dan

shunfei chen

May 9, 2018, 10:10:57 PM
to kaldi-help
Thank you, Dan. I will try some tests on larger tasks.

On Thursday, May 10, 2018 at 3:05:58 AM UTC+8, Dan Povey wrote: