1 code implementation • Research Square (pre-print) 2022 • Yevgeniy Bodyanskiy, Serhii Kostiuk
Deep neural networks often employ piecewise activation functions like ReLU to overcome the effects of exploding and vanishing gradients.
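As a rough illustration of why such piecewise activations help (not taken from the paper): ReLU's derivative is exactly 1 on its active region, so backpropagating through many ReLU layers does not shrink gradients the way saturating activations such as the sigmoid do, whose derivative never exceeds 0.25. A minimal NumPy sketch:

```python
import numpy as np

def relu_grad(x):
    # ReLU'(x) is 1 for x > 0 and 0 otherwise: active units pass
    # gradients through unchanged, so they do not decay layer by layer.
    return (x > 0).astype(float)

def sigmoid_grad(x):
    # sigmoid'(x) = s * (1 - s) peaks at 0.25, so a product of many
    # such local derivatives shrinks toward zero (vanishing gradients).
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

x = np.linspace(-3, 3, 7)
depth = 10  # pretend the same local derivative is applied at 10 layers
print(relu_grad(x) ** depth)     # stays exactly 0 or 1
print(sigmoid_grad(x) ** depth)  # at most 0.25**10, about 1e-6
```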
1 code implementation • System Research and Information Technologies 2022 • Yevgeniy Bodyanskiy, Serhii Kostiuk
The proposed function can be used as a drop-in replacement for the ReLU, SiL, and Swish activations in deep neural networks and can evolve into any of these functions during training.
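A minimal PyTorch sketch of such a trainable hybrid activation, assuming the form f(x) = γ·x·σ(β·x) with trainable β and γ (the paper's exact parameterization and initialization may differ): with β = γ = 1 it equals SiL, training β with γ = 1 matches Swish, and as β grows large it approaches γ·ReLU(x).

```python
import torch
from torch import nn

class HybridActivation(nn.Module):
    """Trainable activation f(x) = gamma * x * sigmoid(beta * x).

    A sketch of a hybrid ReLU/SiL/Swish unit; the paper's actual
    formulation may differ in parameterization and initialization.
    """
    def __init__(self, beta: float = 1.0, gamma: float = 1.0):
        super().__init__()
        # beta and gamma are learned jointly with the network weights.
        self.beta = nn.Parameter(torch.tensor(beta))
        self.gamma = nn.Parameter(torch.tensor(gamma))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # beta = gamma = 1 gives SiL; trainable beta recovers Swish;
        # large beta turns sigmoid(beta * x) into a step, so the
        # function approximates gamma * ReLU(x).
        return self.gamma * x * torch.sigmoid(self.beta * x)

x = torch.linspace(-3, 3, 7)
act = HybridActivation()                 # starts out as SiL
print(act(x))
print(HybridActivation(beta=100.0)(x))   # close to ReLU(x)
```

Because β and γ receive gradients like any other parameter, the optimizer can move the unit between these regimes during training, which is what makes it a drop-in replacement for the fixed activations it generalizes.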