1 code implementation • 24 May 2022 • Shahaf E. Finder, Yair Zohav, Maor Ashkenazi, Eran Treister
Convolutional Neural Networks (CNNs) are known for requiring extensive computational resources, and quantization is among the best and most common methods for compressing them.
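The quantization the abstract refers to can be illustrated with a generic sketch. This is not the paper's method, just plain symmetric uniform weight quantization, the standard compression idea; all names here are illustrative.

```python
# Hedged sketch (NOT the paper's method): symmetric uniform weight
# quantization, the generic compression technique the abstract mentions.
import numpy as np

def quantize_uniform(w, num_bits=8):
    """Quantize an array to num_bits integers on a symmetric uniform grid."""
    qmax = 2 ** (num_bits - 1) - 1          # e.g. 127 for 8 bits
    scale = np.abs(w).max() / qmax          # map the largest weight to qmax
    q = np.clip(np.round(w / scale), -qmax, qmax).astype(np.int8)
    return q, scale                         # integer codes + scale to dequantize

w = np.random.default_rng(0).normal(size=1000).astype(np.float32)
q, scale = quantize_uniform(w)
w_hat = q.astype(np.float32) * scale        # dequantized approximation
# Round-to-nearest keeps the per-weight error within half a grid step.
print(float(np.max(np.abs(w - w_hat))))
```

Storing 8-bit codes plus one float scale in place of 32-bit weights gives roughly a 4x size reduction, at the cost of the bounded rounding error above.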
1 code implementation • CVPR 2022 • Meitar Ronen, Shahaf E. Finder, Oren Freifeld
Using a split/merge framework, a dynamic architecture that adapts to the changing number of clusters K, and a novel loss, our proposed method outperforms existing nonparametric methods, both classical and deep ones.
Ranked #7 on Unsupervised Image Classification on ImageNet
1 code implementation • 18 May 2020 • Shahaf E. Finder, Eran Treister, Oren Freifeld
However, we show that even for a single Gaussian, when GLASSO is tuned to recover the sparsity pattern successfully, it does so at the price of substantially biasing the values of the nonzero entries of the matrix; we further show that this problem only worsens in a mixture setting.
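The shrinkage bias described above can be reproduced with a small hedged sketch using scikit-learn's off-the-shelf `GraphicalLasso` (not the paper's proposed estimator); the true precision matrix and penalty strength below are illustrative choices.

```python
# Hedged sketch (NOT the paper's method): the l1 penalty in GLASSO
# biases the nonzero precision entries toward zero, even for a single
# Gaussian, as the abstract describes.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
# Illustrative sparse, positive-definite precision matrix of a 4-dim Gaussian.
theta = np.array([[2.0, 0.8, 0.0, 0.0],
                  [0.8, 2.0, 0.0, 0.0],
                  [0.0, 0.0, 2.0, 0.8],
                  [0.0, 0.0, 0.8, 2.0]])
cov = np.linalg.inv(theta)
X = rng.multivariate_normal(np.zeros(4), cov, size=5000)

# Fit GLASSO with a penalty strong enough to zero out the true zeros.
model = GraphicalLasso(alpha=0.1).fit(X)
est = model.precision_

# The estimated nonzero off-diagonal entry is shrunk well below the
# true value 0.8 -- the bias the abstract refers to.
print(float(est[0, 1]), "vs true", float(theta[0, 1]))
```

Increasing `alpha` makes the recovered support sparser but worsens this bias on the surviving entries, which is the trade-off the paper analyzes.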