Tanh Activation is an activation function used in neural networks, defined as:
$$f\left(x\right) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$
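The definition can be translated directly into code. Below is a minimal NumPy sketch (the function name `tanh` and the test points are illustrative) that evaluates the formula and checks it against NumPy's built-in `np.tanh`, which is the numerically safer choice in practice since the explicit exponentials can overflow for large |x|:

```python
import numpy as np

def tanh(x):
    # Direct implementation of the definition above:
    # f(x) = (e^x - e^-x) / (e^x + e^-x)
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

x = np.linspace(-3, 3, 7)
print(tanh(x))                            # outputs lie in (-1, 1), centered at 0
print(np.allclose(tanh(x), np.tanh(x)))   # matches NumPy's built-in -> True
```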
Historically, the tanh function became preferred over the sigmoid function because it gave better empirical performance in multi-layer neural networks. However, it did not solve the vanishing gradient problem that sigmoids suffered from, which was tackled more effectively by the introduction of ReLU activations, as the sketch below illustrates.
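To see why tanh still suffers from vanishing gradients, note that its derivative is 1 - tanh(x)², which decays toward zero as |x| grows. The short, assumption-light sketch below (helper name `tanh_grad` is illustrative) prints the gradient at a few points; by contrast, ReLU's gradient is exactly 1 for all positive inputs, so it does not saturate in the same way:

```python
import numpy as np

def tanh_grad(x):
    # d/dx tanh(x) = 1 - tanh(x)^2, which saturates toward 0 as |x| grows
    return 1.0 - np.tanh(x) ** 2

for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x = {x:5.1f}  tanh'(x) = {tanh_grad(x):.2e}")
# The gradient shrinks rapidly for large |x|, so deep stacks of tanh layers
# multiply many small factors together during backpropagation, which is the
# vanishing gradient behaviour mentioned above.
```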
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 25 | 3.70% |
| Sentiment Analysis | 17 | 2.52% |
| Time Series Forecasting | 17 | 2.52% |
| Management | 15 | 2.22% |
| Decision Making | 14 | 2.07% |
| Machine Translation | 13 | 1.93% |
| Classification | 13 | 1.93% |
| Object Detection | 11 | 1.63% |
| Speech Recognition | 11 | 1.63% |