1 code implementation • 1 Jan 2021 • Ahmed T. Elthakeb, Prannoy Pilligundla, Tarek Elgindi, FatemehSadat Mireshghallah, Charles-Alban Deledalle, Hadi Esmaeilzadeh
We show how WaveQ balances compute efficiency and accuracy, and provides a heterogeneous bitwidth assignment for quantization of a large variety of deep networks (AlexNet, CIFAR-10, MobileNet, ResNet-18, ResNet-20, SVHN, and VGG-11) that virtually preserves accuracy.
no code implementations • 29 Feb 2020 • Ahmed T. Elthakeb, Prannoy Pilligundla, FatemehSadat Mireshghallah, Tarek Elgindi, Charles-Alban Deledalle, Hadi Esmaeilzadeh
We show how SINAREQ balances compute efficiency and accuracy, and provides a heterogeneous bitwidth assignment for quantization of a large variety of deep networks (AlexNet, CIFAR-10, MobileNet, ResNet-18, ResNet-20, SVHN, and VGG-11) that virtually preserves accuracy.
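Both WaveQ and SINAREQ produce such per-layer (heterogeneous) bitwidth assignments rather than a single uniform precision. As a purely illustrative sketch (the layer names, MAC counts, and bitwidths below are hypothetical, not taken from the papers), the compute benefit of a heterogeneous assignment can be estimated with a first-order bit-operations proxy:

```python
# Hypothetical per-layer bitwidth assignment and MAC counts (illustrative only).
assignment = {"conv1": 8, "conv2": 4, "conv3": 3, "fc": 5}
macs = {"conv1": 1.2e6, "conv2": 4.8e6, "conv3": 4.8e6, "fc": 0.5e6}

# Weight bit-operations relative to a uniform 8-bit baseline: a common
# first-order proxy for the compute/storage cost of a quantized network.
hetero_cost = sum(macs[layer] * bits for layer, bits in assignment.items())
uniform_cost = sum(macs.values()) * 8
print(f"relative cost vs. uniform 8-bit: {hetero_cost / uniform_cost:.2f}")
```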
no code implementations • ICML 2020 • Ahmed T. Elthakeb, Prannoy Pilligundla, Alex Cloninger, Hadi Esmaeilzadeh
The deep layers of modern neural networks extract a rather rich set of features as an input propagates through the network.
no code implementations • 4 May 2019 • Ahmed T. Elthakeb, Prannoy Pilligundla, Hadi Esmaeilzadeh
To further mitigate this loss, we propose a novel sinusoidal regularization, called SinReQ, for deep quantized training.
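The core idea of such a regularizer is that sin² vanishes exactly at uniformly spaced points, so adding it to the task loss pulls weights toward quantization levels during training. Below is a minimal PyTorch sketch, assuming b-bit weights in [-1, 1] with quantization step 2^(1-b); the exact formulation and weighting schedule in the paper may differ:

```python
import math
import torch

def sinusoidal_penalty(w, bits, lam=1e-2):
    """Sinusoidal regularization term (sketch): it is zero exactly at the
    uniform quantization levels, so its gradient nudges weights toward
    quantization-friendly values while the task loss is optimized."""
    step = 2.0 ** (1 - bits)  # assumed quantization step for b-bit weights in [-1, 1]
    return lam * torch.sin(math.pi * w / step).pow(2).sum()

# Usage: add the penalty for each weight tensor to the task loss, e.g.
# loss = task_loss + sum(sinusoidal_penalty(p, bits=4) for p in model.parameters())
```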
no code implementations • 5 Nov 2018 • Ahmed T. Elthakeb, Prannoy Pilligundla, FatemehSadat Mireshghallah, Amir Yazdanbakhsh, Hadi Esmaeilzadeh
We show how ReLeQ can balance speed and quality, and provide an asymmetric general solution for quantization of a large variety of deep networks (AlexNet, CIFAR-10, LeNet, MobileNet-V1, ResNet-20, SVHN, and VGG-11) that virtually preserves the accuracy (≤ 0.3% loss) while minimizing the computation and storage cost.
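Since ReLeQ frames per-layer bitwidth selection as a reinforcement-learning problem, the sketch below shows the general shape of such a loop. The reward shaping, the `evaluate_quantized` stub, and the random search standing in for ReLeQ's learned policy are all assumptions made here for illustration, not the paper's actual method:

```python
import random

LAYERS = ["conv1", "conv2", "conv3", "fc"]
BITWIDTH_CHOICES = [2, 3, 4, 5, 8]

def evaluate_quantized(assignment):
    """Stub: in practice, quantize the network with these per-layer
    bitwidths, briefly fine-tune it, and return the accuracy drop
    (as a fraction) versus the full-precision baseline."""
    return 0.01 * sum(8 - b for b in assignment.values()) / len(assignment)

def reward(acc_drop, avg_bits, full_bits=8.0):
    # Hypothetical shaping: reward a low average bitwidth,
    # heavily penalize any accuracy degradation.
    return (1.0 - avg_bits / full_bits) - 10.0 * max(acc_drop, 0.0)

best = None
for episode in range(200):
    # Random search stands in for the policy network that ReLeQ
    # trains with policy-gradient updates.
    assignment = {layer: random.choice(BITWIDTH_CHOICES) for layer in LAYERS}
    avg_bits = sum(assignment.values()) / len(LAYERS)
    r = reward(evaluate_quantized(assignment), avg_bits)
    if best is None or r > best[0]:
        best = (r, assignment)

print("best reward and assignment:", best)
```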