Activation Functions

Exponential Linear Squashing Activation

Introduced by Basirat et al. in The Quest for the Golden Activation Function

The Exponential Linear Squashing Activation Function, or ELiSH, is an activation function for neural networks. It shares common properties with Swish, being formed as the product of an ELU and a Sigmoid:

$$f\left(x\right) = \begin{cases} \dfrac{x}{1+e^{-x}} & \text{if } x \geq 0 \\[1ex] \dfrac{e^{x} - 1}{1+e^{-x}} & \text{if } x < 0 \end{cases}$$

The Sigmoid part of ELiSH improves information flow, while the linear part on the positive side avoids vanishing gradients.
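The piecewise definition above can be sketched as a small NumPy function; this is an illustrative implementation of the formula, not code from the paper:

```python
import numpy as np

def elish(x):
    """ELiSH activation: ELU multiplied by a Sigmoid gate.

    f(x) = x * sigmoid(x)             for x >= 0  (the Swish-like branch)
    f(x) = (exp(x) - 1) * sigmoid(x)  for x <  0  (the ELU-like branch)
    """
    x = np.asarray(x, dtype=float)
    sigmoid = 1.0 / (1.0 + np.exp(-x))
    # Select the branch elementwise; both share the sigmoid denominator.
    return np.where(x >= 0, x * sigmoid, (np.exp(x) - 1.0) * sigmoid)
```

For large positive inputs the function approaches the identity (since the sigmoid saturates at 1), while for large negative inputs it decays smoothly toward 0, since both factors vanish.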

Source: The Quest for the Golden Activation Function




| Task | Papers | Share |
| --- | --- | --- |
| Image Classification | 1 | 100.00% |

