Hard Swish

Introduced by Howard et al. in Searching for MobileNetV3

Hard Swish is an activation function based on Swish that replaces the computationally expensive sigmoid with a piecewise linear analogue:

$$\text{h-swish}\left(x\right) = x\frac{\text{ReLU6}\left(x+3\right)}{6} $$

Source: Searching for MobileNetV3
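For reference, here is a minimal NumPy sketch of the definition above (the names `hard_swish` and `relu6` are illustrative, not taken from the paper):

```python
import numpy as np

def relu6(x):
    # ReLU6 clamps its input to the range [0, 6]
    return np.minimum(np.maximum(x, 0.0), 6.0)

def hard_swish(x):
    # h-swish(x) = x * ReLU6(x + 3) / 6
    return x * relu6(x + 3.0) / 6.0

# Quick check at a few points; values: 0, 0, 0, 3, 6
# (inputs <= -3 map to 0, and hard_swish(x) approaches x for large x)
x = np.array([-6.0, -3.0, 0.0, 3.0, 6.0])
print(hard_swish(x))
```

Because every piece is a clamp, multiply, or add, the function avoids the exponential in the sigmoid and quantizes well, which is why it was chosen for mobile architectures such as MobileNetV3.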


Tasks


| Task | Papers | Share |
|---|---|---|
| Image Classification | 12 | 15.00% |
| Object Detection | 9 | 11.25% |
| Quantization | 5 | 6.25% |
| Classification | 4 | 5.00% |
| Semantic Segmentation | 4 | 5.00% |
| Test | 3 | 3.75% |
| Neural Network Compression | 2 | 2.50% |
| Bayesian Optimization | 2 | 2.50% |
| Network Pruning | 2 | 2.50% |

Components


| Component | Type |
|---|---|
| ReLU6 | Activation Functions |