Forward and Backward Information Retention for Accurate Binary Neural Networks

24 Sep 2019 · Haotong Qin, Ruihao Gong, Xianglong Liu, Mingzhu Shen, Ziran Wei, Fengwei Yu, Jingkuan Song

Weight and activation binarization is an effective approach to deep neural network compression, and it can accelerate inference by leveraging bitwise operations. Although many binarization methods have improved model accuracy by minimizing the quantization error in forward propagation, a noticeable performance gap remains between the binarized model and the full-precision one...
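As a concrete illustration of the forward-propagation quantization error the abstract refers to, here is a minimal sketch of a common weight-binarization scheme (not necessarily the method of this paper): each real-valued weight tensor is approximated as a scaling factor times its sign, with the scale chosen as the mean absolute value, which minimizes the L2 quantization error for this form.

```python
import numpy as np

def binarize_weights(w: np.ndarray):
    """Approximate w as alpha * b with b in {-1, +1}.

    alpha = mean(|w|) minimizes ||w - alpha * b||_2 when b = sign(w).
    Function name and signature are illustrative, not from the paper.
    """
    alpha = float(np.mean(np.abs(w)))   # optimal L2 scaling factor
    b = np.where(w >= 0, 1.0, -1.0)     # sign, with sign(0) taken as +1
    return alpha, b

w = np.array([0.5, -1.2, 0.3, -0.7])
alpha, b = binarize_weights(w)
quant_error = np.linalg.norm(w - alpha * b)  # residual the forward pass incurs
```

At inference time the expensive multiplications against `w` can then be replaced by sign flips against `b` plus a single multiplication by `alpha`, which is what enables the bitwise speedups mentioned above.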
