Search Results for author: Serhii Kostiuk

Found 2 papers, 2 papers with code

Deep neural network based on F-neurons and its learning

1 code implementation · Research Square (preprint) · 2022 · Yevgeniy Bodyanskiy, Serhii Kostiuk

Deep neural networks often employ piecewise activation functions like ReLU to overcome the effects of exploding and vanishing gradients.

Activation Function Synthesis Image Classification
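The abstract above notes that piecewise activations such as ReLU help with vanishing gradients. A minimal sketch of why (not taken from the paper itself): ReLU's gradient is exactly 1 for every positive input, so repeated backpropagation through active units does not shrink the gradient the way saturating activations do.

```python
def relu(x: float) -> float:
    # Piecewise-linear activation: identity for positive inputs,
    # zero otherwise.
    return x if x > 0.0 else 0.0

def relu_grad(x: float) -> float:
    # Gradient is exactly 1 on the active branch, which keeps
    # backpropagated gradients from vanishing through deep stacks.
    return 1.0 if x > 0.0 else 0.0
```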

Adaptive hybrid activation function for deep neural networks

1 code implementation · System Research and Information Technologies · 2022 · Yevgeniy Bodyanskiy, Serhii Kostiuk

The proposed function can be used as a drop-in replacement for the ReLU, SiL, and Swish activations in deep neural networks and can evolve into any of these functions during training.

Activation Function Synthesis Image Classification +1
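One plausible way an activation can interpolate between SiL/Swish and ReLU, as the abstract describes, is the Swish family with a trainable slope parameter. This is a sketch of that general idea, not the paper's exact AHAF formula: `beta = 1` recovers SiL/Swish, and large `beta` approaches ReLU, so gradient descent on `beta` can move the function between these shapes.

```python
import math

def adaptive_activation(x: float, beta: float = 1.0) -> float:
    # Swish-style activation with a tunable slope parameter `beta`
    # (hypothetical illustration; the paper's AHAF may differ).
    # beta = 1.0 gives SiL/Swish; as beta grows, the curve
    # approaches ReLU.
    return x / (1.0 + math.exp(-beta * x))
```

In a network, `beta` would be registered as a learnable parameter per neuron or per layer, so training itself selects the activation shape.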
