Activation Functions

Leaky ReLU

Leaky Rectified Linear Unit, or Leaky ReLU, is an activation function based on ReLU, but with a small slope for negative values instead of a flat (zero) slope. The slope coefficient is fixed before training, i.e. it is not learnt during training. This type of activation function is popular in tasks that may suffer from sparse gradients, for example training generative adversarial networks.
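A minimal sketch of the function, assuming a NumPy implementation and a hypothetical default slope coefficient of 0.01 (the coefficient is a fixed hyperparameter, not a learned parameter):

    import numpy as np

    def leaky_relu(x, negative_slope=0.01):
        """Leaky ReLU: identity for positive inputs, a small fixed slope for negative inputs."""
        return np.where(x > 0, x, negative_slope * x)

    # Negative inputs are scaled rather than zeroed out,
    # so a small gradient still flows through inactive units.
    x = np.array([-2.0, -0.5, 0.0, 1.5])
    print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5  ]

Deep learning frameworks provide this directly, e.g. torch.nn.LeakyReLU in PyTorch, where the negative_slope argument plays the same role.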

Tasks


Task                         Papers   Share
Image Generation                 88  12.96%
Image-to-Image Translation       54   7.95%
Disentanglement                  30   4.42%
Image Manipulation               22   3.24%
Domain Adaptation                20   2.95%
Face Generation                  19   2.80%
Super-Resolution                 19   2.80%
Semantic Segmentation            19   2.80%
Style Transfer                   15   2.21%
