Sigmoid Activations are a type of activation function for neural networks:
$$f\left(x\right) = \frac{1}{\left(1+\exp\left(-x\right)\right)}$$
Some drawbacks of this activation noted in the literature are: sharply damped (vanishing) gradients during backpropagation from deeper hidden layers back to the inputs, gradient saturation, and slow convergence.
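As an illustration, here is a minimal NumPy sketch (not tied to any particular framework) of the sigmoid and its derivative $f'(x) = f(x)\left(1 - f(x)\right)$. The derivative peaks at 0.25 and shrinks toward zero as $|x|$ grows, which is the saturation behaviour described above.

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: 1 / (1 + exp(-x)), split on sign for numerical stability."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0,
                    1.0 / (1.0 + np.exp(-x)),      # safe for large positive x
                    np.exp(x) / (1.0 + np.exp(x))) # safe for large negative x

def sigmoid_grad(x):
    """Derivative f'(x) = f(x) * (1 - f(x)), as used in backpropagation."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Saturation demo: the gradient is at most 0.25 (at x = 0) and is
# already ~4.5e-5 at x = 10, so little gradient signal flows backward.
xs = np.array([0.0, 2.0, 5.0, 10.0])
print("f(x):  ", sigmoid(xs))
print("f'(x): ", sigmoid_grad(xs))
```

Because the per-layer gradient factor never exceeds 0.25, repeated multiplication across many layers is one intuition for why sigmoid networks train slowly and why gradients vanish toward the input layers.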
The table below lists the tasks most commonly addressed by papers that use sigmoid activations, with the number of papers and their share:

Task | Papers | Share |
---|---|---|
Deep Learning | 25 | 3.13% |
Computational Efficiency | 21 | 2.63% |
Prediction | 19 | 2.38% |
Sentiment Analysis | 18 | 2.25% |
Time Series Forecasting | 17 | 2.13% |
Decoder | 16 | 2.00% |
Object Detection | 15 | 1.88% |
Management | 13 | 1.63% |
Translation | 13 | 1.63% |