Mish

Mish is an activation function for neural networks, defined as:

$$ f\left(x\right) = x \cdot \tanh\left(\text{softplus}\left(x\right)\right) $$

where

$$\text{softplus}\left(x\right) = \ln\left(1+e^{x}\right)$$

(Compare with functionally similar, previously proposed activation functions such as the GELU, $x\Phi(x)$, and the SiLU, $x\sigma(x)$.)

Source: Mish: A Self Regularized Non-Monotonic Activation Function
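
For illustration, here is a minimal NumPy sketch of the definition above; it is not the paper's reference implementation (frameworks such as PyTorch provide a built-in, e.g. torch.nn.Mish):

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x); logaddexp(0, x) computes this
    # without overflow for large positive x
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish: f(x) = x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

# Example: Mish is non-monotonic, dipping slightly below zero
# for small negative inputs before saturating toward zero.
x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(mish(x))  # approx [-0.03, -0.30, 0.00, 0.87, 5.00]
```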


Tasks


Task                          Papers   Share
Object Detection                  71  15.92%
Object                            33   7.40%
Image Classification              21   4.71%
Deep Learning                     18   4.04%
Real-Time Object Detection        11   2.47%
Medical Diagnosis                 10   2.24%
Semantic Segmentation              9   2.02%
General Classification             9   2.02%
Classification                     7   1.57%

Categories

Activation Functions