Rectified Linear Units, or ReLUs, are a type of activation function that is linear for positive inputs and zero for negative inputs. The kink at zero is the source of the non-linearity. Linearity for positive inputs has the attractive property that the gradient does not saturate (in contrast with sigmoid activations), although the gradient is zero over half of the real line.
$$ f\left(x\right) = \max\left(0, x\right) $$
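As a concrete illustration, here is a minimal NumPy sketch of the function and its gradient; the subgradient at $x = 0$ is conventionally taken to be 0 here.

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: identity for x > 0, zero otherwise."""
    return np.maximum(0.0, x)

def relu_grad(x):
    """Gradient of ReLU: 1 for x > 0, 0 for x <= 0 (subgradient 0 chosen at the kink)."""
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```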
| Task | Papers | Share |
|---|---|---|
| Semantic Segmentation | 52 | 8.47% |
| Image Segmentation | 34 | 5.54% |
| Image Classification | 24 | 3.91% |
| Denoising | 23 | 3.75% |
| Classification | 21 | 3.42% |
| Medical Image Segmentation | 19 | 3.09% |
| Object Detection | 18 | 2.93% |
| Self-Supervised Learning | 14 | 2.28% |
| Image Generation | 14 | 2.28% |