ReLU6 is a modification of the rectified linear unit that caps the activation at a maximum value of $6$: $\text{ReLU6}(x) = \min\left(\max(0, x),\, 6\right)$. The cap is motivated by increased robustness when the activation is used with low-precision computation.
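As a minimal sketch (assuming PyTorch, which exposes this activation as `torch.nn.ReLU6`), the cap can be written as a simple clamp of the input:

```python
import torch
import torch.nn as nn

def relu6(x: torch.Tensor) -> torch.Tensor:
    # min(max(0, x), 6): zero out negatives, then cap at 6.
    return torch.clamp(x, min=0.0, max=6.0)

x = torch.tensor([-2.0, 3.0, 8.0])
print(relu6(x))        # tensor([0., 3., 6.])
print(nn.ReLU6()(x))   # the built-in module gives the same result
```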
Image Credit: PyTorch

Source: MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications