
# Parametric Exponential Linear Unit

Introduced by Trottier et al. in Parametric Exponential Linear Unit for Deep Convolutional Neural Networks

The Parametric Exponential Linear Unit, or PELU, is an activation function for neural networks. It learns a parameterization of the ELU so that the proper activation shape can be learned at each layer of a CNN.

The PELU has two additional parameters over the ELU:

$$f\left(x\right) = cx \quad \text{ if } x > 0$$ $$f\left(x\right) = \alpha\left(\exp\left(\frac{x}{b}\right) - 1\right) \quad \text{ if } x \leq 0$$

Where $\alpha$, $b$, and $c > 0$. Here $c$ changes the slope in the positive quadrant, $b$ controls the scale of the exponential decay, and $\alpha$ controls the saturation in the negative quadrant.
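The piecewise definition above can be sketched as a scalar function; the parameter defaults below are illustrative placeholders, not values from the paper (in practice $\alpha$, $b$, and $c$ would be learned per layer):

```python
import math

def pelu(x, alpha=1.0, b=1.0, c=1.0):
    """Sketch of PELU: c scales the positive slope, b the scale of the
    exponential decay, alpha the negative saturation (all assumed > 0)."""
    if x > 0:
        return c * x
    return alpha * (math.exp(x / b) - 1.0)
```

For large negative inputs the output saturates at $-\alpha$, e.g. `pelu(-100.0, alpha=2.0)` is approximately `-2.0`, while positive inputs pass through with slope `c`.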

