Activation Function Synthesis

3 papers with code • 0 benchmarks • 0 datasets


Most implemented papers

Adaptive hybrid activation function for deep neural networks

s-kostyuk/ahaf_activation_pytorch System research and information technologies 2022

The proposed function can be used as a drop-in replacement for the ReLU, SiL, and Swish activations in deep neural networks and can evolve into one of those functions during training.
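As a rough illustration of how one function can cover ReLU, SiL, and Swish, the sketch below uses the parameterized form `f(x) = gamma * x * sigmoid(beta * x)`. This is an assumption about the shape of such a hybrid (the paper's exact formulation may differ): at `beta = 1, gamma = 1` it reduces to SiLU/Swish, and as `beta` grows large it approaches ReLU.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ahaf_like(x, beta=1.0, gamma=1.0):
    """Hypothetical adaptive hybrid activation (a sketch, not the
    paper's verified formula): f(x) = gamma * x * sigmoid(beta * x).
    In a trainable setting beta and gamma would be learned parameters;
    they are fixed here for illustration."""
    return gamma * x * sigmoid(beta * x)

x = np.linspace(-5.0, 5.0, 11)
silu_like = ahaf_like(x, beta=1.0)    # beta = 1 gives the SiLU/Swish shape
relu_like = ahaf_like(x, beta=50.0)   # large beta approaches ReLU
```

With `beta` and `gamma` registered as trainable parameters (e.g. `nn.Parameter` in PyTorch), gradient descent can move the function between these regimes during training, which is the "evolve" behavior the abstract describes.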

Deep neural network based on F-neurons and its learning

s-kostyuk/f-neuron Research Square (pre-print) 2022

Deep neural networks often employ piecewise activation functions such as ReLU to mitigate the effects of exploding and vanishing gradients.
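The vanishing-gradient motivation behind that sentence can be shown numerically: the sigmoid's derivative is at most 0.25, so a chain of such derivatives shrinks geometrically with depth, while ReLU's derivative is exactly 1 on the positive side. A minimal comparison (illustrative only, with a fixed pre-activation value at every layer):

```python
import numpy as np

def sigmoid_grad(x):
    # derivative of the logistic sigmoid: s * (1 - s), bounded by 0.25
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

def relu_grad(x):
    # derivative of ReLU: 1 for x > 0, else 0
    return (np.asarray(x) > 0).astype(float)

# Product of per-layer derivatives across 20 layers, all evaluated
# at the same pre-activation value x = 1.0 for simplicity.
depth = 20
x = 1.0
sig_chain = sigmoid_grad(x) ** depth    # shrinks toward zero
relu_chain = relu_grad(x) ** depth      # stays exactly 1.0
```

The sigmoid chain collapses to roughly 1e-14 after 20 layers, while the ReLU chain passes the gradient through unchanged.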

Learnable Extended Activation Function (LEAF) for Deep Neural Networks

s-kostyuk/leaf-aaf International Journal of Computing 2023

This paper introduces the Learnable Extended Activation Function (LEAF), an adaptive activation function that combines the properties of squashing functions and rectifier units.
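One plausible way to combine squashing and rectifier behavior in a single learnable unit is a parameterized mixture. The sketch below is a hypothetical form for illustration, not the paper's verified LEAF formula: a mixing weight `alpha` blends a squashing term (`tanh`) with a rectifier-like term (`x * sigmoid(beta * x)`), so the function can learn to act like either family.

```python
import numpy as np

def leaf_like(x, alpha=0.5, beta=1.0):
    """Hypothetical adaptive blend (an assumption, not the paper's
    exact parameterization): alpha interpolates between a squashing
    term and a SiLU-like rectifier term. In training, alpha and beta
    would be learnable; they are fixed here for illustration."""
    squash = np.tanh(x)
    rectify = x / (1.0 + np.exp(-beta * x))
    return alpha * squash + (1.0 - alpha) * rectify

x = np.linspace(-3.0, 3.0, 7)
pure_squash = leaf_like(x, alpha=1.0)   # reduces to tanh
pure_rect = leaf_like(x, alpha=0.0)     # reduces to a SiLU-like unit
```

Intermediate values of `alpha` give a function that saturates less than `tanh` for large inputs while remaining bounded below, which is the kind of hybrid behavior the abstract attributes to LEAF.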