Geometric Monomial (GEM): a family of rational 2N-differentiable activation functions
arXiv cs.AI
arXiv:2604.21677v1 Announce Type: cross

Abstract: The choice of activation function plays a crucial role in the optimization and performance of deep neural networks. While the Rectified Linear Unit (ReLU) remains the dominant choice due to its simplicity and effectiveness, its lack of smoothness may hinder gradient-based optimization in deep architectures. In this work, we propose a family of $C^{2N}$-smooth activation functions whose gate follows a log-logistic CDF, achieving ReLU-like performance…
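The abstract does not state the closed form, but a gate following a log-logistic CDF suggests a Swish-style construction: the identity multiplied by that CDF. Below is a minimal sketch under assumptions of scale 1 and shape $2N$; the function name `gem`, the parameter `N`, and this exact parameterization are illustrative guesses, not the paper's definition.

```python
import numpy as np

def gem(x, N=1):
    """Sketch of a log-logistic-gated activation (assumed form, not the
    paper's exact definition): x times the log-logistic CDF with scale 1
    and shape 2N, which is 0 on the gate's unsupported region x <= 0."""
    x = np.asarray(x, dtype=float)
    p = x ** (2 * N)  # even power, so well defined for all real x
    # Log-logistic CDF: F(x) = x^{2N} / (1 + x^{2N}) for x > 0, else 0.
    return np.where(x > 0, x * p / (1.0 + p), 0.0)

# For large positive x the gate tends to 1, so gem(x) ~ x (ReLU-like);
# for x <= 0 the output is exactly 0, and near the origin the positive
# branch behaves like x^{2N+1}, so the two pieces join with 2N matching
# derivatives, consistent with the advertised C^{2N} smoothness.
print(gem(np.array([-2.0, 0.0, 0.5, 2.0]), N=1))
# -> approximately [0.  0.  0.1  1.6]
```

Under these assumptions the function is rational on each branch, matching the "rational" qualifier in the title, while avoiding the non-differentiable kink that ReLU has at 0.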