Hard Swish is an activation function based on Swish that replaces the computationally expensive sigmoid with a piecewise linear analogue:
$$\text{h-swish}\left(x\right) = x \cdot \frac{\text{ReLU6}\left(x+3\right)}{6}$$
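As a concrete illustration (not from the source, and assuming PyTorch as the framework), the piecewise form reduces to a one-liner over the built-in `relu6`; recent PyTorch versions also ship this natively as `torch.nn.Hardswish`:

```python
import torch
import torch.nn.functional as F

def hard_swish(x: torch.Tensor) -> torch.Tensor:
    """Piecewise linear approximation of Swish: x * ReLU6(x + 3) / 6."""
    return x * F.relu6(x + 3.0) / 6.0

# The function is 0 for x <= -3, the identity for x >= 3,
# and interpolates between the two regimes in (-3, 3).
print(hard_swish(torch.tensor([-4.0, 0.0, 1.0, 4.0])))
```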
Source: Searching for MobileNetV3
| Task | Papers | Share |
|---|---|---|
| Image Classification | 12 | 25.00% |
| Object Detection | 8 | 16.67% |
| Semantic Segmentation | 4 | 8.33% |
| Quantization | 2 | 4.17% |
| General Classification | 2 | 4.17% |
| Adversarial Attack | 1 | 2.08% |
| Speech Recognition | 1 | 2.08% |
| Multi-Label Learning | 1 | 2.08% |
| Reinforcement Learning | 1 | 2.08% |