Mish is an activation function for neural networks, defined as:
$$ f\left(x\right) = x\cdot\tanh\left(\text{softplus}\left(x\right)\right)$$
where
$$\text{softplus}\left(x\right) = \ln\left(1+e^{x}\right)$$
(Compare with functionally similar, previously proposed activation functions such as the GELU, $x\Phi(x)$, and the SiLU, $x\sigma(x)$.)
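As a concrete illustration, here is a minimal sketch of Mish in plain NumPy (the function names and test values are ours, not from the paper). The softplus term is computed with `np.logaddexp(0, x)`, which equals $\ln(1+e^{x})$ but avoids overflow for large inputs.

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + exp(x)), computed as logaddexp(0, x) for numerical stability
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish: f(x) = x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

# Quick check on a few values: smooth and non-monotonic,
# slightly negative for small negative x, close to x for large positive x
x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(mish(x))
```

In PyTorch, the same activation is available in recent versions as `torch.nn.functional.mish`, or it can be composed from `torch.tanh` and `torch.nn.functional.softplus` as above.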
Source: Mish: A Self Regularized Non-Monotonic Activation Function
Usage of Mish across tasks, by number of papers:

Task | Papers | Share |
---|---|---|
Object Detection | 67 | 18.82% |
Image Classification | 21 | 5.90% |
Real-Time Object Detection | 10 | 2.81% |
Medical Diagnosis | 10 | 2.81% |
Semantic Segmentation | 9 | 2.53% |
General Classification | 9 | 2.53% |
Classification | 7 | 1.97% |
Autonomous Driving | 7 | 1.97% |
Object Tracking | 5 | 1.40% |