# ReLU6

Introduced by Howard et al. in MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications

ReLU6 is a modification of the rectified linear unit that caps the activation at a maximum value of $6$, i.e. $\text{ReLU6}(x) = \min(\max(0, x), 6)$. The cap improves robustness when the network is used with low-precision (fixed-point) computation, since the bounded range is easier to represent with few bits.
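A minimal sketch of the function in NumPy (the function name `relu6` is an illustrative choice; deep learning frameworks such as PyTorch expose an equivalent `nn.ReLU6` module):

```python
import numpy as np

def relu6(x):
    # Clamp activations to the range [0, 6]: min(max(0, x), 6)
    return np.minimum(np.maximum(x, 0.0), 6.0)

# Negative inputs map to 0, inputs above 6 saturate at 6
print(relu6(np.array([-2.0, 3.0, 8.0])))  # [0. 3. 6.]
```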
