Activation Functions

Rectified Linear Units

Rectified Linear Units, or ReLUs, are activation functions that are linear in the positive dimension and zero in the negative dimension. The kink at zero is the source of the non-linearity. Linearity in the positive dimension has the attractive property that the gradient does not saturate there (in contrast with sigmoid activations), although for half of the real line the gradient is zero.

$$ f\left(x\right) = \max\left(0, x\right) $$
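
A minimal NumPy sketch of the function and its subgradient, to make the behaviour concrete (the names `relu` and `relu_grad` are illustrative, not taken from any particular library):

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x): identity for positive inputs, zero elsewhere.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Subgradient: 1 where x > 0, 0 where x < 0 (0 is conventionally used at x = 0).
    # Positive activations therefore do not saturate, but negative ones pass no gradient.
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```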

Tasks


Task                         Papers   Share
Semantic Segmentation        56       8.12%
Image Segmentation           33       4.78%
Image Generation             30       4.35%
Decoder                      25       3.62%
Denoising                    24       3.48%
Image Classification         22       3.19%
Medical Image Segmentation   16       2.32%
Self-Supervised Learning     14       2.03%
Object Detection             14       2.03%

