Tanh Activation is an activation function used in neural networks:
$$f\left(x\right) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$
Historically, the tanh function came to be preferred over the sigmoid function because its zero-centered outputs tend to make optimization of multi-layer neural networks easier. However, it does not solve the vanishing gradient problem that sigmoids also suffer from: the gradient still saturates for large $|x|$, an issue that was tackled more effectively by the introduction of ReLU activations.
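As a quick illustration, the sketch below (plain NumPy, not tied to any particular framework) implements tanh directly from the definition above, along with its derivative $f'(x) = 1 - f(x)^2$. The rapid decay of the derivative away from zero is the saturation behaviour behind the vanishing-gradient issue just mentioned.

```python
import numpy as np

def tanh(x):
    # Direct implementation of f(x) = (e^x - e^-x) / (e^x + e^-x).
    # In practice np.tanh(x) is the numerically stable choice.
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

def tanh_grad(x):
    # f'(x) = 1 - tanh(x)^2, which approaches 0 as |x| grows,
    # so gradients vanish for saturated activations.
    t = np.tanh(x)
    return 1.0 - t ** 2

if __name__ == "__main__":
    xs = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
    print("tanh(x): ", np.round(tanh(xs), 4))
    print("tanh'(x):", np.round(tanh_grad(xs), 4))
    # The gradient peaks at 1.0 at x = 0 and is already ~2e-4 at |x| = 5,
    # which is the saturation that causes vanishing gradients in deep stacks.
```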
The table below lists the tasks in which tanh activations are most commonly used, with the number of papers and each task's share of the total:

| Task | Papers | Share |
|---|---|---|
| Language Modelling | 23 | 3.44% |
| Time Series Forecasting | 16 | 2.39% |
| Sentiment Analysis | 16 | 2.39% |
| Management | 15 | 2.24% |
| Classification | 14 | 2.09% |
| Decision Making | 14 | 2.09% |
| Machine Translation | 13 | 1.94% |
| Object Detection | 11 | 1.64% |
| Speech Recognition | 11 | 1.64% |