Search Results for author: Koen Helwegen

Found 3 papers, 2 papers with code

How Do Adam and Training Strategies Help BNNs Optimization?

No code implementations · 21 Jun 2021 · Zechun Liu, Zhiqiang Shen, Shichao Li, Koen Helwegen, Dong Huang, Kwang-Ting Cheng

We show that the regularization effect of Adam's second-order momentum is crucial for revitalizing weights that are dead due to activation saturation in BNNs.
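The effect described above can be illustrated with a small numeric sketch (not the paper's code, and the gradient value here is a made-up stand-in for a saturated activation): plain SGD moves a dead weight in proportion to its near-zero gradient, while Adam's second-moment normalization keeps the step size on the order of the learning rate.

```python
import math

lr = 0.01
tiny_grad = 1e-6  # hypothetical near-zero gradient from a saturated activation

# Plain SGD: the update stays proportional to the tiny gradient.
sgd_step = lr * tiny_grad  # ~1e-8: the weight barely moves

# Adam: the second moment v shrinks along with the gradient, so the
# normalized step m_hat / (sqrt(v_hat) + eps) stays close to 1 in magnitude.
m = v = 0.0
beta1, beta2, eps = 0.9, 0.999, 1e-8
steps = 100
for t in range(1, steps + 1):
    m = beta1 * m + (1 - beta1) * tiny_grad
    v = beta2 * v + (1 - beta2) * tiny_grad ** 2
m_hat = m / (1 - beta1 ** steps)
v_hat = v / (1 - beta2 ** steps)
adam_step = lr * m_hat / (math.sqrt(v_hat) + eps)  # ~1e-2

print(sgd_step, adam_step)
```

With a constant gradient, m_hat / sqrt(v_hat) approaches 1, so Adam's step is roughly the full learning rate and several orders of magnitude larger than the SGD step — the intuition behind why Adam can revive such weights.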

Larq Compute Engine: Design, Benchmark, and Deploy State-of-the-Art Binarized Neural Networks

1 code implementation · 18 Nov 2020 · Tom Bannink, Arash Bakhtiari, Adam Hillier, Lukas Geiger, Tim de Bruin, Leon Overweel, Jelmer Neeven, Koen Helwegen

We introduce Larq Compute Engine, the world's fastest Binarized Neural Network (BNN) inference engine, and use this framework to investigate several important questions about the efficiency of BNNs and to design a new state-of-the-art BNN architecture.
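The speed of BNN inference engines comes from the standard BNN trick of replacing floating-point multiply-accumulates with XNOR and popcount on bit-packed ±1 values. The following is a pure-Python sketch of that arithmetic only — function and variable names are mine, and it says nothing about Larq Compute Engine's actual optimized kernels.

```python
import random

def binary_dot(a_bits: int, b_bits: int, n: int) -> int:
    """Dot product of two {-1,+1} vectors packed into n-bit integers
    (bit 1 encodes +1, bit 0 encodes -1), via XNOR + popcount."""
    xnor = ~(a_bits ^ b_bits) & ((1 << n) - 1)  # 1 wherever the signs agree
    agree = bin(xnor).count("1")                # popcount
    return 2 * agree - n                        # agreements minus disagreements

def pack(vec):
    """Pack a list of ±1 values into an integer, one bit per element."""
    return sum((x == 1) << i for i, x in enumerate(vec))

# Sanity check against the ordinary float dot product.
n = 16
a = [random.choice([-1, 1]) for _ in range(n)]
b = [random.choice([-1, 1]) for _ in range(n)]
assert binary_dot(pack(a), pack(b), n) == sum(x * y for x, y in zip(a, b))
```

One machine word holds 32 or 64 such weights, so a single XNOR + popcount replaces that many float multiply-adds — the source of the efficiency gains the abstract refers to.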

Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization

3 code implementations · NeurIPS 2019 · Koen Helwegen, James Widdicombe, Lukas Geiger, Zechun Liu, Kwang-Ting Cheng, Roeland Nusselder

Together, the redefinition of latent weights as inertia and the introduction of Bop enable a better understanding of BNN optimization and open the way for further improvements in BNN training methodologies.
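Bop operates on the binary weights directly rather than on latent real-valued weights: it keeps an exponential moving average of each gradient and flips a weight's sign only when that average exceeds a threshold τ and pushes against the weight's current sign. A pure-Python sketch of this update rule, as described in the paper (the function name and default hyperparameters here are illustrative):

```python
def bop_step(weights, grads, momenta, gamma=1e-2, tau=1e-6):
    """One Bop update over a list of ±1 weights.

    momenta holds the exponential moving average of the gradients;
    a weight flips when |m| > tau and sign(m) == sign(w), i.e. when
    the averaged gradient consistently favors the opposite sign.
    """
    for i, g in enumerate(grads):
        momenta[i] = (1 - gamma) * momenta[i] + gamma * g
        if abs(momenta[i]) > tau and momenta[i] * weights[i] > 0:
            weights[i] = -weights[i]  # flip toward -sign(m)
    return weights, momenta

# Illustrative usage: strong gradients flip the first two weights,
# a gradient agreeing with -sign(w) leaves the third unchanged.
w, m = [1, -1, 1], [0.0, 0.0, 0.0]
w, m = bop_step(w, [1.0, -1.0, -1.0], m, gamma=0.1, tau=1e-3)
print(w)
```

The threshold τ acts as the "inertia" the abstract describes: noisy, small gradient averages leave the binary weight untouched, and only a persistent signal causes a flip.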
