ReLU6 is a modification of the rectified linear unit (ReLU) that caps the activation at a maximum value of $6$: $\text{ReLU6}(x) = \min\left(\max(0, x), 6\right)$. The cap is motivated by increased robustness when the activation is used with low-precision computation.
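PyTorch provides this activation as the built-in `nn.ReLU6` module; the following is a minimal sketch of the clamping behaviour, with the equivalent element-wise formulation shown via `torch.clamp` for comparison:

```python
import torch
import torch.nn as nn

# ReLU6 clamps each element to the range [0, 6]: min(max(0, x), 6).
x = torch.linspace(-3.0, 9.0, steps=5)      # sample inputs: [-3, 0, 3, 6, 9]

relu6 = nn.ReLU6()
print(relu6(x))                              # tensor([0., 0., 3., 6., 6.])

# Equivalent element-wise formulation via clamping:
print(torch.clamp(x, min=0.0, max=6.0))      # same result
```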
Source: MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
| Task | Papers | Share |
|---|---|---|
| Image Classification | 15 | 17.05% |
| Object Detection | 10 | 11.36% |
| Quantization | 5 | 5.68% |
| Semantic Segmentation | 5 | 5.68% |
| Classification | 4 | 4.55% |
| Test | 3 | 3.41% |
| Neural Network Compression | 2 | 2.27% |
| Bayesian Optimization | 2 | 2.27% |
| Network Pruning | 2 | 2.27% |
| Component | Type |
|---|---|
| No components found | - |