Hard Swish is an activation function based on Swish that replaces the computationally expensive sigmoid with a piecewise linear analogue:
$$\text{h-swish}\left(x\right) = x\frac{\text{ReLU6}\left(x+3\right)}{6} $$
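The formula above can be sketched directly in plain Python (a minimal scalar implementation, not the optimized kernel used in practice):

```python
def relu6(x: float) -> float:
    # ReLU capped at 6: min(max(x, 0), 6)
    return min(max(x, 0.0), 6.0)

def hard_swish(x: float) -> float:
    # h-swish(x) = x * ReLU6(x + 3) / 6
    return x * relu6(x + 3.0) / 6.0
```

Note the piecewise behaviour: for x ≤ -3 the output is 0, for x ≥ 3 it equals x exactly, and in between it is the quadratic x(x + 3)/6 — so `hard_swish(0.0)` is 0 and `hard_swish(5.0)` is 5.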
Source: Searching for MobileNetV3
Task | Papers | Share |
---|---|---|
Image Classification | 12 | 13.79% |
Object Detection | 9 | 10.34% |
Classification | 6 | 6.90% |
Quantization | 5 | 5.75% |
Semantic Segmentation | 4 | 4.60% |
Bayesian Optimization | 3 | 3.45% |
Neural Network Compression | 2 | 2.30% |
Network Pruning | 2 | 2.30% |
Reinforcement Learning (RL) | 2 | 2.30% |