ReLU6 is a modification of the rectified linear unit in which the activation is capped at a maximum value of $6$: $\text{ReLU6}(x) = \min(\max(0, x), 6)$. The cap is motivated by increased robustness when the activation is used with low-precision computation.
Image Credit: PyTorch
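As a quick illustration, here is a minimal sketch of ReLU6 in PyTorch (using `torch.nn.ReLU6`, which implements the definition above; the equivalent clamp-based form is shown for comparison):

```python
import torch
import torch.nn as nn

# PyTorch's built-in ReLU6 clamps activations to the range [0, 6].
relu6 = nn.ReLU6()

x = torch.tensor([-3.0, 0.0, 2.5, 6.0, 9.0])
print(relu6(x))  # tensor([0.0000, 0.0000, 2.5000, 6.0000, 6.0000])

# Equivalent manual formulation: min(max(0, x), 6)
print(torch.clamp(x, min=0.0, max=6.0))
```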
Source: MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
| Task | Papers | Share |
|---|---|---|
| Image Classification | 13 | 24.53% |
| Object Detection | 9 | 16.98% |
| Semantic Segmentation | 5 | 9.43% |
| Quantization | 2 | 3.77% |
| General Classification | 2 | 3.77% |
| Adversarial Attack | 1 | 1.89% |
| Speech Recognition | 1 | 1.89% |
| Multi-Label Learning | 1 | 1.89% |
| Reinforcement Learning | 1 | 1.89% |