Activation Functions

Sigmoid Activation

The sigmoid is a classic activation function for neural networks; it maps any real-valued input to the range $(0, 1)$:

$$f(x) = \frac{1}{1 + \exp(-x)}$$

Several drawbacks of this activation have been noted in the literature: vanishing gradients during backpropagation from deeper hidden layers back to the inputs, gradient saturation for inputs of large magnitude, and slow convergence. Because the output is not zero-centered, gradients for the weights of a layer also tend to share the same sign, which can further slow optimization.
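A minimal sketch of the function and its derivative illustrates the saturation problem; the helper names (`sigmoid`, `sigmoid_grad`) are illustrative, not from any particular library. The derivative $f'(x) = f(x)\,(1 - f(x))$ peaks at only $0.25$ at $x = 0$ and decays toward zero for large $|x|$, which is why gradients shrink as they flow backward through stacked sigmoid layers.

```python
import math

def sigmoid(x):
    # Numerically stable sigmoid: avoid overflow in exp() for large |x|.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

def sigmoid_grad(x):
    # Derivative f'(x) = f(x) * (1 - f(x)); its maximum is 0.25 at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

# Gradient saturation: for large |x| the derivative is nearly zero,
# so almost no gradient flows back through the unit.
for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x={x:5.1f}  f(x)={sigmoid(x):.6f}  f'(x)={sigmoid_grad(x):.6f}")
```

Running the loop shows the gradient dropping from `0.25` at the origin to roughly `4.5e-05` at `x = 10`, a concrete view of the saturation described above.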




| Task | Papers | Share |
| --- | --- | --- |
| Time Series | 38 | 5.90% |
| Language Modelling | 30 | 4.66% |
| Image Classification | 20 | 3.11% |
| Sentiment Analysis | 18 | 2.80% |
| Machine Translation | 16 | 2.48% |
| Object Detection | 16 | 2.48% |
| Speech Recognition | 13 | 2.02% |
| Text Generation | 12 | 1.86% |
| Semantic Segmentation | 11 | 1.71% |

