Activation Function Synthesis
3 papers with code • 0 benchmarks • 0 datasets
Most implemented papers
Adaptive hybrid activation function for deep neural networks
The proposed function can serve as a drop-in replacement for the ReLU, SiL, and Swish activations in deep neural networks, and can evolve into one of these functions during training.
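The idea of an activation that can evolve between ReLU-like and Swish-like shapes can be illustrated with a Swish-style unit whose slope parameter is trainable: with β = 1 it recovers SiL/SiLU, and as β grows it approaches ReLU. This is a minimal NumPy sketch of that general mechanism, not the paper's exact parameterization.

```python
import numpy as np

class LearnableSwish:
    """Swish-style activation f(x) = x * sigmoid(beta * x) with trainable beta.

    beta = 1 recovers SiL/SiLU; as beta -> infinity the curve approaches
    ReLU, so learning beta lets the unit move between these shapes.
    """

    def __init__(self, beta: float = 1.0):
        self.beta = beta

    def __call__(self, x: np.ndarray) -> np.ndarray:
        # x * sigmoid(beta * x)
        return x / (1.0 + np.exp(-self.beta * x))

    def grad_beta(self, x: np.ndarray) -> np.ndarray:
        # df/dbeta = x^2 * sigmoid(beta*x) * (1 - sigmoid(beta*x)),
        # used to update beta by gradient descent alongside the weights
        s = 1.0 / (1.0 + np.exp(-self.beta * x))
        return x * x * s * (1.0 - s)
```

With a large β the unit is nearly indistinguishable from ReLU, which is what makes it usable as a drop-in replacement during training.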
Deep neural network based on F-neurons and its learning
Deep neural networks often employ piecewise activation functions such as ReLU to mitigate the effects of exploding and vanishing gradients.
Learnable Extended Activation Function (LEAF) for Deep Neural Networks
This paper introduces the Learnable Extended Activation Function (LEAF), an adaptive activation function that combines the properties of squashing functions and rectifier units.
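The combination of a squashing component and a rectifier component can be sketched generically. The function and parameter names below are hypothetical illustrations of that idea, not LEAF's actual parameterization, which the paper defines precisely.

```python
import numpy as np

def squash_rectify_mix(x: np.ndarray,
                       alpha: float = 1.0,
                       beta: float = 1.0,
                       gamma: float = 1.0) -> np.ndarray:
    """Hypothetical activation mixing a bounded squashing term with an
    unbounded rectifier term via learnable scalars (alpha, beta, gamma)."""
    squash = np.tanh(beta * x)       # bounded, sigmoid/tanh-like component
    rectify = np.maximum(0.0, x)     # unbounded ReLU component
    return alpha * squash + gamma * rectify
```

Depending on the learned scalars, such a unit can behave like a pure squashing function (γ = 0), a pure rectifier (α = 0), or any mixture in between.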