1 code implementation • 23 Jan 2023 • Mahdi Zolnouri, Dounia Lakhmiri, Christophe Tribes, Eyyüb Sari, Sébastien Le Digabel
Training time budget and size of the dataset are among the factors affecting the performance of a Deep Neural Network (DNN).
no code implementations • 22 Dec 2022 • Vahid Partovi Nia, Eyyüb Sari, Vanessa Courville, Masoud Asgharian
Recurrent neural networks (RNNs) are the backbone of many text and speech applications.
no code implementations • NeurIPS 2021 • Tim Dockhorn, YaoLiang Yu, Eyyüb Sari, Mahdi Zolnouri, Vahid Partovi Nia
BinaryConnect (BC) and its many variations have become the de facto standard for neural network quantization.
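The core BinaryConnect idea can be sketched in a few lines: binarize the latent full-precision weights with the sign function for the forward pass, then update the latent weights directly with the straight-through gradient and clip them to [-1, 1]. The variable names and numbers below are illustrative, not the paper's code.

```python
import numpy as np

def binarize(w):
    # BinaryConnect forward pass: deterministic sign binarization to {-1, +1}
    return np.where(w >= 0, 1.0, -1.0)

# Illustrative single training step (hypothetical values).
w = np.array([0.3, -0.7, 0.05, -0.2])   # full-precision latent weights
wb = binarize(w)                         # binary weights used in the forward pass
grad = np.array([0.1, -0.2, 0.4, 0.0])  # gradient w.r.t. the binary weights
lr = 0.1
# Straight-through estimator: apply the gradient to the latent weights,
# then clip them to [-1, 1] as BinaryConnect does.
w = np.clip(w - lr * grad, -1.0, 1.0)
```

At inference time only the binary weights are kept, which is what makes BC-style quantization attractive for deployment.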
no code implementations • 20 Sep 2021 • Eyyüb Sari, Vanessa Courville, Vahid Partovi Nia
Deploying RNNs that include layer normalization and attention on integer-only arithmetic is still an open problem.
Automatic Speech Recognition (ASR)
no code implementations • 29 Apr 2020 • Eyyüb Sari, Vahid Partovi Nia
Implementation of quantized neural networks on computing hardware leads to considerable speed-up and memory savings.
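The source of that speed-up is that arithmetic can be carried out entirely in low-precision integers, with a single rescaling at the end. A minimal sketch of symmetric int8 quantization of a dot product (a generic illustration, not the paper's scheme):

```python
import numpy as np

def quantize_int8(x):
    # Symmetric uniform quantization of a float vector to int8.
    scale = np.abs(x).max() / 127.0
    q = np.round(x / scale).astype(np.int8)
    return q, scale

a = np.array([0.5, -1.0, 0.25])
b = np.array([1.0, 0.5, -0.5])
qa, sa = quantize_int8(a)
qb, sb = quantize_int8(b)
# Integer-only accumulation in int32; rescale once at the end.
approx = int(qa.astype(np.int32) @ qb.astype(np.int32)) * sa * sb
exact = float(a @ b)  # the float32 reference result
```

On hardware, the int8 multiply-accumulate replaces the float one, which is where the speed and memory gains come from.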
no code implementations • 26 Sep 2019 • Ryan Razani, Grégoire Morin, Vahid Partovi Nia, Eyyüb Sari
Ternary quantization provides a more flexible model and outperforms binary quantization in terms of accuracy; however, it doubles the memory footprint and increases the computational cost.
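Ternary quantization maps weights to {-1, 0, +1}, so small weights are zeroed out while large ones keep their sign. A minimal threshold-based sketch, assuming a hypothetical threshold set as a fraction of the maximum weight magnitude:

```python
import numpy as np

def ternarize(w, t=0.5):
    # Threshold-based ternary quantization to {-1, 0, +1};
    # t is a hypothetical threshold fraction of the max magnitude.
    delta = t * np.abs(w).max()
    return np.where(w > delta, 1.0, np.where(w < -delta, -1.0, 0.0))

w = np.array([0.9, -0.8, 0.1, -0.3, 0.6])
# delta = 0.45, so the two small-magnitude weights are zeroed out
print(ternarize(w))  # [ 1. -1.  0.  0.  1.]
```

The extra zero state is why ternary values need two bits per weight instead of one, doubling the footprint relative to binary.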
no code implementations • 18 Sep 2019 • Eyyüb Sari, Mouloud Belbahri, Vahid Partovi Nia
Binary Neural Networks (BNNs) are difficult to train and suffer from a drop in accuracy.
no code implementations • 10 Sep 2019 • Ramchalam Kinattinkara Ramakrishnan, Eyyüb Sari, Vahid Partovi Nia
Pruning is one of the most effective model reduction techniques.
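The simplest instance of pruning is magnitude pruning: zero out the fraction of weights with the smallest absolute value. A generic sketch of that baseline (not the specific method of the paper):

```python
import numpy as np

def magnitude_prune(w, sparsity):
    # Zero out the `sparsity` fraction of smallest-magnitude weights.
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    thresh = np.sort(np.abs(w).ravel())[k - 1]
    return np.where(np.abs(w) <= thresh, 0.0, w)

w = np.array([0.05, -0.9, 0.2, -0.01, 0.7])
print(magnitude_prune(w, 0.4))  # [ 0.  -0.9  0.2  0.   0.7]
```

The resulting sparse tensor can then be stored and executed more cheaply, which is what makes pruning an effective model reduction technique.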
no code implementations • 18 Jan 2019 • Mouloud Belbahri, Eyyüb Sari, Sajad Darabi, Vahid Partovi Nia
Using a quasiconvex base function to construct a binary quantizer helps train binary neural networks (BNNs), and adding noise to the input data or using a concrete regularization function helps reduce the generalization error.