Rectified Linear Units, or ReLUs, are a type of activation function that is linear in the positive dimension and zero in the negative dimension. The kink at zero is the source of the non-linearity. Linearity in the positive dimension has the attractive property of preventing gradient saturation (in contrast with sigmoid activations), although the gradient is zero over half of the real line.
$$ f\left(x\right) = \max\left(0, x\right) $$
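A minimal NumPy sketch of the function and its gradient (the helper names `relu` and `relu_grad` are illustrative, not from a specific library):

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x): identity for positive inputs, zero otherwise
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is 1 for x > 0 and 0 for x < 0; the value at x = 0 is
    # undefined and conventionally taken as 0 here
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```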
| Task | Papers | Share |
|---|---|---|
| Semantic Segmentation | 49 | 9.02% |
| Image Classification | 30 | 5.52% |
| Reinforcement Learning | 20 | 3.68% |
| Object Detection | 15 | 2.76% |
| Medical Image Segmentation | 14 | 2.58% |
| Image-to-Image Translation | 11 | 2.03% |
| Super-Resolution | 11 | 2.03% |
| Image Generation | 11 | 2.03% |
| Denoising | 9 | 1.66% |