The Hard Sigmoid is a piecewise-linear activation function used in neural networks, defined as:

$$f\left(x\right) = \max\left(0, \min\left(1, \frac{x+1}{2}\right)\right)$$

Image Source: Rinat Maksutov

Source: BinaryConnect: Training Deep Neural Networks with binary weights during propagations
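
The definition above is cheap to evaluate because it avoids the exponential of the exact sigmoid: it is linear on $[-1, 1]$ and saturates at 0 and 1 outside that interval. A minimal NumPy sketch of the formula (the function name `hard_sigmoid` is illustrative, not taken from the source):

```python
import numpy as np

def hard_sigmoid(x):
    """Piecewise-linear approximation of the logistic sigmoid.

    Computes max(0, min(1, (x + 1) / 2)): linear on [-1, 1],
    saturating at 0 below x = -1 and at 1 above x = 1.
    """
    return np.clip((x + 1.0) / 2.0, 0.0, 1.0)

# Example: values saturate outside [-1, 1]
print(hard_sigmoid(np.array([-2.0, -1.0, 0.0, 1.0, 2.0])))
# [0.  0.  0.5 1.  1. ]
```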

Latest Papers

| Paper | Authors | Date |
| --- | --- | --- |
| AdaptiveReID: Adaptive L2 Regularization in Person Re-Identification | Xingyang Ni, Liang Fang, Heikki Huttunen | 2020-07-15 |
| MaskConvNet: Training Efficient ConvNets from Scratch via Budget-constrained Filter Pruning | Raden Mu'az Mun'im, Jie Lin, Vijay Chandrasekhar, Koichi Shinoda | 2020-01-01 |
| $L_0$-ARM: Network Sparsification via Stochastic Binary Optimization | Yang Li, Shihao Ji | 2019-04-09 |
| BinaryConnect: Training Deep Neural Networks with binary weights during propagations | Matthieu Courbariaux, Yoshua Bengio, Jean-Pierre David | 2015-11-02 |

Tasks

| Task | Papers | Share |
| --- | --- | --- |
| Person Re-Identification | 1 | 50.00% |
| Network Pruning | 1 | 50.00% |

Components

No components found.

Categories