Using ReLU and GeLU as activation Function

Gabriel Bruno Garcia de Souza

unread,
Oct 8, 2025, 9:42:43 AM
to aenet
Dear Dr. Artrith,

I was reading this paper and noticed that you have included ReLU and GeLU as activation functions.

However, looking through the GitHub repository of the aenet source code, I could not find those new activation functions.

Are they part of aenet-PyTorch only? How can I get access to an aenet version with those new activation functions implemented?
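For context, these are the two activations I mean (a minimal sketch in plain Python using the exact Gaussian-CDF form of GELU; how aenet or aenet-PyTorch implements them internally may differ):

```python
import math

def relu(x: float) -> float:
    # ReLU: max(0, x)
    return max(0.0, x)

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))
```

Both are smooth enough for training; GELU additionally passes a small negative signal near zero instead of clipping it outright.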
2101.10468v2.pdf