Activation Functions

Scaled Exponential Linear Unit

Introduced by Klambauer et al. in Self-Normalizing Neural Networks

Scaled Exponential Linear Units, or SELUs, are activation functions that induce self-normalizing properties: with appropriately initialized weights, neuron activations converge toward zero mean and unit variance as they propagate through the layers.

The SELU activation function is given by

$$f\left(x\right) = \lambda \begin{cases} x & \text{if } x \geq 0 \\ \alpha\left(\exp\left(x\right) - 1\right) & \text{if } x < 0 \end{cases}$$

with $\alpha \approx 1.6733$ and $\lambda \approx 1.0507$.

Source: Self-Normalizing Neural Networks
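As a concrete reference, below is a minimal NumPy sketch of SELU using the commonly quoted full-precision values of $\alpha$ and $\lambda$; it is illustrative only and is not taken from the original paper's code.

```python
import numpy as np

# Full-precision SELU constants as commonly quoted for
# Self-Normalizing Neural Networks (Klambauer et al.).
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """Apply the Scaled Exponential Linear Unit elementwise."""
    x = np.asarray(x, dtype=np.float64)
    # Identity branch for x >= 0, scaled exponential branch for x < 0.
    return LAMBDA * np.where(x >= 0, x, ALPHA * (np.exp(x) - 1.0))

# Example: negative inputs saturate, positive inputs are scaled by lambda.
print(selu([-2.0, 0.0, 2.0]))  # approx. [-1.52, 0.0, 2.10]
```

For negative inputs the output saturates toward $-\lambda\alpha \approx -1.758$, which bounds the activations from below and contributes to the self-normalizing behaviour.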

Tasks

Task | Papers | Share
Image Classification | 18 | 8.82%
Quantization | 12 | 5.88%
Classification | 7 | 3.43%
Speech Recognition | 6 | 2.94%
Gesture Recognition | 6 | 2.94%
Object Detection | 6 | 2.94%
Optical Flow Estimation | 5 | 2.45%
Audio Classification | 5 | 2.45%
Object | 5 | 2.45%
