Hard Sigmoid

The Hard Sigmoid is a piecewise-linear activation function for neural networks, defined as:

$$f(x) = \max\left(0, \min\left(1, \frac{x+1}{2}\right)\right)$$

Source: BinaryConnect: Training Deep Neural Networks with binary weights during propagations
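For reference, here is a minimal NumPy sketch of the formula above; the function name hard_sigmoid and the use of np.clip are illustrative choices, not taken from the BinaryConnect paper.

```python
import numpy as np

def hard_sigmoid(x):
    # f(x) = max(0, min(1, (x + 1) / 2)): a shifted, rescaled identity
    # clamped into [0, 1]; np.clip applies both bounds at once.
    return np.clip((x + 1.0) / 2.0, 0.0, 1.0)

x = np.array([-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0])
print(hard_sigmoid(x))  # [0.   0.   0.25 0.5  0.75 1.   1.  ]
```

Because the function saturates exactly at 0 and 1 (rather than asymptotically, like the logistic sigmoid), it is cheap to compute and its gradient is a simple indicator of the region $-1 < x < 1$. Note that frameworks ship variants with different slopes; for example, PyTorch's nn.Hardsigmoid clips $x/6 + 1/2$ to $[0, 1]$, so the constants should be checked before treating implementations as interchangeable.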

Tasks

Task                   Papers  Share
Clustering             1       25.00%
Image Segmentation     1       25.00%
Semantic Segmentation  1       25.00%
Network Pruning        1       25.00%

Categories

Activation Functions