1 code implementation • ICCV 2023 • Natalia Frumkin, Dibakar Gope, Diana Marculescu
Evol-Q improves the top-1 accuracy of a fully quantized ViT-Base by $10.30\%$, $0.78\%$, and $0.15\%$ for $3$-bit, $4$-bit, and $8$-bit weight quantization levels.
no code implementations • 5 Dec 2022 • Hung-Yueh Chiang, Natalia Frumkin, Feng Liang, Diana Marculescu
MobileTL trains the shifts for internal normalization layers to avoid storing activation maps for the backward pass.
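The idea of training only the shift terms can be sketched as follows. This is a minimal PyTorch illustration under my own assumptions (a toy two-block network, BatchNorm as the normalization layer), not the paper's implementation: the gradient of a normalization layer's shift is just a sum over upstream gradients, so it needs no stored input activations.

```python
import torch.nn as nn

# Toy network standing in for a pretrained backbone (hypothetical).
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.BatchNorm2d(8),
    nn.ReLU(),
    nn.Conv2d(8, 8, 3, padding=1),
    nn.BatchNorm2d(8),
)

# Freeze everything, then re-enable gradients only for the
# normalization shifts (the bias terms of the BatchNorm layers).
for p in model.parameters():
    p.requires_grad = False
for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        m.bias.requires_grad = True

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only the BatchNorm shift (bias) parameters remain
```

Because only the shifts receive gradients, the backward pass through those layers does not require the activation maps that scale or weight gradients would need, which is what saves memory on device.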
no code implementations • 17 Nov 2022 • Natalia Frumkin, Dibakar Gope, Diana Marculescu
Borrowing the idea of contrastive loss from self-supervised learning, we find a robust way to jointly minimize a loss function using just 1,000 calibration images.
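A contrastive calibration objective of this flavor can be sketched with an InfoNCE-style loss that pulls each quantized output toward the full-precision output of the same calibration image and pushes it away from other images' outputs. This is a hypothetical NumPy sketch of the general technique (function name, pairing scheme, and temperature `tau` are my assumptions, not the paper's exact formulation):

```python
import numpy as np

def contrastive_calibration_loss(q_out, fp_out, tau=0.1):
    """InfoNCE-style loss between quantized and full-precision outputs.

    q_out, fp_out: (N, D) arrays of model outputs on the same N
    calibration images. Row i of q_out is a positive pair with row i
    of fp_out; all other rows act as negatives.
    """
    q = q_out / np.linalg.norm(q_out, axis=1, keepdims=True)
    f = fp_out / np.linalg.norm(fp_out, axis=1, keepdims=True)
    sim = q @ f.T / tau                          # pairwise cosine similarities
    sim = sim - sim.max(axis=1, keepdims=True)   # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # match each image with itself
```

With the quantization parameters as the only free variables, minimizing this loss over a small calibration set (e.g. 1,000 images) needs no labels, since the full-precision model supplies the targets.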