Tanh Activation is an activation function used in neural networks:
$$f\left(x\right) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$
Historically, the tanh function came to be preferred over the sigmoid function, as its zero-centered output gave better performance in multi-layer neural networks. However, it did not solve the vanishing gradient problem that sigmoids also suffer from, which was later addressed more effectively by the introduction of ReLU activations.
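To make the saturation behavior concrete, here is a minimal NumPy sketch (not part of the original page) that evaluates tanh and its derivative, $1 - \tanh^2(x)$. The derivative shrinks toward zero for large $|x|$, which is the vanishing-gradient issue noted above.

```python
import numpy as np

def tanh(x):
    # Tanh activation: (e^x - e^-x) / (e^x + e^-x), output in (-1, 1)
    return np.tanh(x)

def tanh_grad(x):
    # Derivative of tanh: 1 - tanh(x)^2; approaches 0 as |x| grows,
    # which is the saturation that causes vanishing gradients
    return 1.0 - np.tanh(x) ** 2

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(tanh(x))       # zero-centered outputs in (-1, 1)
print(tanh_grad(x))  # gradient is near 0 at |x| = 5 (saturation)
```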
Tasks in which tanh activations most commonly appear, by share of papers:

Task | Papers | Share
---|---|---
Language Modelling | 24 | 3.43%
Decoder | 20 | 2.86%
Time Series Forecasting | 19 | 2.72%
Sentence | 18 | 2.58%
Management | 16 | 2.29%
Decision Making | 15 | 2.15%
Image Generation | 14 | 2.00%
Classification | 14 | 2.00%
Sentiment Analysis | 13 | 1.86%