Dear Dr. Artrith,
I was reading this paper and noticed that you have included ReLU and GELU as activation functions.
However, looking through the GitHub repository of the aenet source code, I could not find these new activation functions.
Are they part of aenet-PyTorch only? How can I get access to an aenet version with these new activation functions implemented?