
# Hard Sigmoid

Introduced by Courbariaux et al. in *BinaryConnect: Training Deep Neural Networks with binary weights during propagations*.

The Hard Sigmoid is a piecewise-linear activation function for neural networks, defined as:

$$f\left(x\right) = \max\left(0, \min\left(1, \frac{x+1}{2}\right)\right)$$
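The formula above can be sketched as a small NumPy function (a minimal illustration, not any particular library's implementation; the function name is our own):

```python
import numpy as np

def hard_sigmoid(x):
    # f(x) = max(0, min(1, (x + 1) / 2))
    # np.clip bounds the linear ramp (x + 1) / 2 to the interval [0, 1].
    return np.clip((x + 1.0) / 2.0, 0.0, 1.0)

# The function saturates at 0 for x <= -1, at 1 for x >= 1,
# and is linear in between, e.g. hard_sigmoid(0.0) == 0.5.
```

Because it replaces the exponential in the logistic sigmoid with clipping and a linear ramp, it is cheaper to compute, which is why BinaryConnect uses it during training with binary weights.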

