Tanh Activation is an activation function used in neural networks:
$$f\left(x\right) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$
Historically, the tanh function became preferred over the sigmoid function as it gave better performance for multi-layer neural networks. However, it did not solve the vanishing gradient problem from which sigmoids suffered; that issue was tackled more effectively with the introduction of ReLU activations.
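A minimal sketch in Python (using NumPy, an assumed dependency) that evaluates tanh and its derivative, illustrating the saturation behind the vanishing gradient issue mentioned above:

```python
import numpy as np

def tanh(x):
    # f(x) = (e^x - e^-x) / (e^x + e^-x)
    return np.tanh(x)

def tanh_grad(x):
    # f'(x) = 1 - tanh(x)^2; it approaches 0 as |x| grows,
    # which is why tanh still suffers from vanishing gradients
    return 1.0 - np.tanh(x) ** 2

for x in [0.0, 2.0, 5.0]:
    print(f"x={x:>4}: tanh={tanh(x):+.4f}, grad={tanh_grad(x):.4f}")
```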
Image Source: Junxi Feng
| Task | Papers | Share |
| --- | --- | --- |
| Prediction | 27 | 3.40% |
| Deep Learning | 24 | 3.02% |
| Sentiment Analysis | 21 | 2.64% |
| Time Series Forecasting | 19 | 2.39% |
| Translation | 16 | 2.02% |
| Management | 15 | 1.89% |
| Computational Efficiency | 15 | 1.89% |
| Decoder | 13 | 1.64% |
| Language Modelling | 13 | 1.64% |