The Hard Sigmoid is an activation function used in neural networks. It is a piecewise-linear approximation of the sigmoid that is cheaper to compute, defined as:
$$f(x) = \max\left(0, \min\left(1, \frac{x+1}{2}\right)\right)$$
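A minimal NumPy sketch of the definition above (the function name `hard_sigmoid` is illustrative, not from the source; deep learning frameworks may define their own variants with different slopes and offsets):

```python
import numpy as np

def hard_sigmoid(x):
    """Piecewise-linear approximation of the sigmoid:
    f(x) = max(0, min(1, (x + 1) / 2))."""
    return np.clip((x + 1.0) / 2.0, 0.0, 1.0)

# Inputs below -1 saturate at 0, inputs above 1 saturate at 1,
# and the function is linear in between.
print(hard_sigmoid(np.array([-2.0, -1.0, 0.0, 1.0, 2.0])))
# -> [0.  0.  0.5 1.  1. ]
```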
Source: BinaryConnect: Training Deep Neural Networks with binary weights during propagations
| Task | Papers | Share |
|---|---|---|
| Image Segmentation | 1 | 33.33% |
| Semantic Segmentation | 1 | 33.33% |
| Network Pruning | 1 | 33.33% |