
Hard Swish

Introduced by Howard et al. in Searching for MobileNetV3

Hard Swish is an activation function based on Swish, but it replaces the computationally expensive sigmoid with a piecewise-linear analogue:

$$\text{h-swish}\left(x\right) = x\frac{\text{ReLU6}\left(x+3\right)}{6} $$

Source: Searching for MobileNetV3
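
As a concrete illustration of the formula above, here is a minimal NumPy sketch (the function names `relu6` and `hard_swish` are just illustrative):

```python
import numpy as np

def relu6(x):
    # ReLU6 clamps values to the range [0, 6]
    return np.minimum(np.maximum(x, 0.0), 6.0)

def hard_swish(x):
    # h-swish(x) = x * ReLU6(x + 3) / 6
    return x * relu6(x + 3.0) / 6.0

# Example: evaluate on a few points; output matches x for large positive x
# and approaches 0 for large negative x, mimicking Swish.
x = np.linspace(-6.0, 6.0, 7)
print(hard_swish(x))
```

For practical use, most deep learning frameworks ship this as a built-in (e.g. `torch.nn.Hardswish` in PyTorch), which avoids reimplementing it by hand.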


Tasks


| Task | Papers | Share |
|------|--------|-------|
| Image Classification | 14 | 13.33% |
| Object Detection | 9 | 8.57% |
| Quantization | 6 | 5.71% |
| Classification | 6 | 5.71% |
| Decoder | 6 | 5.71% |
| Semantic Segmentation | 5 | 4.76% |
| Computational Efficiency | 3 | 2.86% |
| Bayesian Optimization | 3 | 2.86% |
| Image Segmentation | 2 | 1.90% |

Components


| Component | Type |
|-----------|------|
| ReLU6 | Activation Functions |

Categories

Activation Functions