Activation Functions

Leaky ReLU

Leaky Rectified Linear Unit, or Leaky ReLU, is an activation function based on the ReLU, but with a small slope for negative values instead of a flat slope. The slope coefficient is fixed before training, i.e. it is not learned during training. This type of activation function is popular in tasks that may suffer from sparse gradients, such as training generative adversarial networks.
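
In formula form, Leaky ReLU passes positive inputs through unchanged and scales negative inputs by a small constant slope \alpha (a value of 0.01 is a common choice):

f(x) = \begin{cases} x & \text{if } x \geq 0 \\ \alpha x & \text{otherwise} \end{cases}

As a minimal illustration, a NumPy sketch of this behaviour might look as follows (the function name and the default slope value here are illustrative choices, not taken from the page above):

import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # Positive inputs pass through unchanged; negative inputs are
    # scaled by a small fixed slope instead of being zeroed out, so
    # a non-zero gradient still flows for negative pre-activations.
    return np.where(x >= 0, x, negative_slope * x)

# Example: leaky_relu(np.array([-2.0, 0.0, 3.0])) -> array([-0.02, 0., 3.])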


Common tasks among papers that use Leaky ReLU:

Task                          Papers   Share
Image Generation              87       12.72%
Image-to-Image Translation    50       7.31%
Disentanglement               27       3.95%
Face Generation               20       2.92%
Image Manipulation            19       2.78%
Style Transfer                17       2.49%
Domain Adaptation             17       2.49%
Super-Resolution              17       2.49%
Semantic Segmentation         17       2.49%

