Hard Swish is an activation function based on Swish that replaces the computationally expensive sigmoid with a piecewise linear analogue:
$$\text{h-swish}\left(x\right) = x\frac{\text{ReLU6}\left(x+3\right)}{6} $$
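As a concrete illustration, here is a minimal NumPy sketch of the formula above (the helper names `relu6` and `hard_swish` are ours for illustration; frameworks such as PyTorch ship a built-in equivalent, e.g. `torch.nn.Hardswish`):

```python
import numpy as np

def relu6(x):
    # ReLU6: min(max(x, 0), 6)
    return np.minimum(np.maximum(x, 0.0), 6.0)

def hard_swish(x):
    # h-swish(x) = x * ReLU6(x + 3) / 6
    return x * relu6(x + 3.0) / 6.0

# The function is 0 for x <= -3 and equals x for x >= 3,
# with a smooth-ish piecewise-quadratic transition in between.
x = np.array([-4.0, -3.0, 0.0, 1.0, 4.0])
print(hard_swish(x))  # [0. 0. 0. 0.6667 4.]
```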
Source: Searching for MobileNetV3
| Task | Papers | Share |
|---|---|---|
| Image Classification | 14 | 13.33% |
| Object Detection | 9 | 8.57% |
| Quantization | 6 | 5.71% |
| Classification | 6 | 5.71% |
| Decoder | 6 | 5.71% |
| Semantic Segmentation | 5 | 4.76% |
| Computational Efficiency | 3 | 2.86% |
| Bayesian Optimization | 3 | 2.86% |
| Image Segmentation | 2 | 1.90% |