Activation Functions

Mish is a self-regularized, non-monotonic activation function for neural networks, defined as:

$$ f\left(x\right) = x \cdot \tanh\left(\text{softplus}\left(x\right)\right)$$

where

$$\text{softplus}\left(x\right) = \ln\left(1+e^{x}\right)$$

(Compare with functionally similar previously proposed activation functions such as the GELU $x\Phi(x)$ and the SiLU $x\sigma(x)$.)
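The definition above translates directly into code. Below is a minimal sketch in plain Python (standard library only); it is illustrative rather than numerically hardened — for example, `math.exp` will overflow for very large inputs, which a production implementation would guard against:

```python
import math

def softplus(x):
    # softplus(x) = ln(1 + e^x); log1p improves accuracy when e^x is small
    # NOTE: math.exp(x) overflows for x greater than ~709, so this sketch
    # assumes inputs in a moderate range.
    return math.log1p(math.exp(x))

def mish(x):
    # Mish: f(x) = x * tanh(softplus(x))
    return x * math.tanh(softplus(x))
```

Because `tanh(softplus(x))` approaches 1 for large positive `x` and 0 for large negative `x`, `mish(x)` behaves like the identity for large positive inputs and decays toward 0 for large negative ones, while remaining smooth and non-monotonic (it dips slightly below zero) near the origin.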

Source: Mish: A Self Regularized Non-Monotonic Activation Function
