# SERLU

Introduced by Zhang et al. in *Effectiveness of Scaled Exponentially-Regularized Linear Units (SERLUs)*.

SERLU, or Scaled Exponentially-Regularized Linear Unit, is an activation function. It introduces a bump-shaped response in the region of negative input: the response is approximately zero for large negative inputs, while the bump pushes the output of SERLU towards zero mean statistically.

$$\text{SERLU}\left(x\right) = \begin{cases} \lambda_{serlu}x & \text{if } x \geq 0 \\ \lambda_{serlu}\alpha_{serlu}xe^{x} & \text{if } x < 0 \end{cases}$$

where the two parameters $\lambda_{serlu} > 0$ and $\alpha_{serlu} > 0$ remain to be specified.
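The piecewise definition above can be sketched in a few lines of NumPy. The default `lam` and `alpha` values below are illustrative placeholders, not the constants derived in the paper; the function name `serlu` is likewise assumed for this example.

```python
import numpy as np

def serlu(x, lam=1.07, alpha=2.90):
    """SERLU: lam * x for x >= 0, lam * alpha * x * exp(x) for x < 0.

    lam and alpha defaults are placeholder values for illustration;
    the paper derives specific constants for these parameters.
    """
    x = np.asarray(x, dtype=float)
    # Negative branch x * exp(x) produces the bump: it dips below zero
    # near the origin and decays back toward zero for large negative x.
    return np.where(x >= 0, lam * x, lam * alpha * x * np.exp(x))
```

Note that the negative branch decays to zero as `x → -∞` (since `exp(x)` vanishes faster than `x` grows), which is the "approximately zero response to large negative input" described above.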
