Scaled Exponential Linear Units, or SELUs, are activation functions that induce self-normalizing properties.
The SELU activation function is given by
$$f(x) = \lambda \begin{cases} x & \text{if } x \geq 0 \\ \alpha\left(\exp(x) - 1\right) & \text{if } x < 0 \end{cases}$$
with $\alpha \approx 1.6733$ and $\lambda \approx 1.0507$.
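As a minimal sketch, the piecewise definition above can be implemented elementwise with NumPy; the constants below are the full-precision values commonly used for $\alpha$ and $\lambda$:

```python
import numpy as np

# Full-precision SELU constants (rounded in the text to 1.6733 and 1.0507)
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """Scaled Exponential Linear Unit, applied elementwise.

    Returns lambda * x for x >= 0 and
    lambda * alpha * (exp(x) - 1) for x < 0.
    """
    x = np.asarray(x, dtype=float)
    return LAMBDA * np.where(x >= 0, x, ALPHA * (np.exp(x) - 1.0))
```

For large negative inputs the output saturates near $-\lambda\alpha \approx -1.7581$, which is part of what drives activations toward zero mean and unit variance.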
Source: *Self-Normalizing Neural Networks*
| Task | Papers | Share |
|---|---|---|
| Image Classification | 18 | 9.84% |
| Quantization | 12 | 6.56% |
| Classification | 7 | 3.83% |
| Speech Recognition | 6 | 3.28% |
| Gesture Recognition | 6 | 3.28% |
| Object Detection | 6 | 3.28% |
| Optical Flow Estimation | 5 | 2.73% |
| Audio Classification | 5 | 2.73% |
| Edge-computing | 4 | 2.19% |