Parametric Exponential Linear Unit

Introduced by Trottier et al. in Parametric Exponential Linear Unit for Deep Convolutional Neural Networks

The Parametric Exponential Linear Unit, or PELU, is an activation function for neural networks. It learns a parameterization of the ELU in order to find the proper activation shape at each layer of a CNN.

The PELU has two additional learnable parameters compared with the ELU:

$$ f\left(x\right) = cx \text{ if } x > 0 $$ $$ f\left(x\right) = \alpha\left(\exp\left(\frac{x}{b}\right) - 1\right) \text{ if } x \leq 0 $$

Where $\alpha$, $b$, and $c > 0$. Here $c$ changes the slope in the positive quadrant, $b$ controls the scale of the exponential decay, and $\alpha$ controls the saturation in the negative quadrant. Constraining $c = \frac{\alpha}{b}$ keeps the function differentiable at $x = 0$, which is why only the two parameters $\alpha$ and $b$ are learned.
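
To make the definition concrete, the following is a minimal sketch of PELU as a PyTorch module; PyTorch, the class name, and the clamping used to keep the parameters positive are illustrative assumptions rather than details taken from the paper.

```python
import torch
import torch.nn as nn


class PELU(nn.Module):
    """Sketch of a Parametric Exponential Linear Unit.

    f(x) = (alpha / b) * x           for x >= 0
    f(x) = alpha * (exp(x / b) - 1)  for x <  0
    """

    def __init__(self, alpha: float = 1.0, b: float = 1.0):
        super().__init__()
        # One learnable scalar per layer; per-channel parameters are
        # another option the formulation allows.
        self.alpha = nn.Parameter(torch.tensor(alpha))
        self.b = nn.Parameter(torch.tensor(b))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Clamping is one simple way to keep alpha, b > 0 during training.
        a = self.alpha.clamp(min=1e-4)
        b = self.b.clamp(min=1e-4)
        # Evaluate exp only on the non-positive part: torch.where computes
        # both branches, and exp of a large positive x would overflow.
        neg = a * (torch.exp(torch.clamp(x, max=0.0) / b) - 1)
        return torch.where(x >= 0, (a / b) * x, neg)
```

Used as a drop-in replacement for nn.ELU inside a convolutional block, $\alpha$ and $b$ are then updated by backpropagation together with the layer weights.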


Tasks


Task | Papers | Share
Atari Games | 1 | 20.00%
Image Classification | 1 | 20.00%
Reinforcement Learning (RL) | 1 | 20.00%
Object Recognition | 1 | 20.00%
Scene Understanding | 1 | 20.00%
