1 code implementation • 29 Oct 2024 • Lior Dikstein, Ariel Lapid, Arnon Netzer, Hai Victor Habi
We analyze existing data generation methods based on batch normalization (BN) matching and identify several gaps between synthetic and real data: 1) Current generation algorithms do not optimize the entire synthetic dataset simultaneously; 2) Data augmentations applied during training are often overlooked; and 3) A distribution shift occurs in the final model layers due to the absence of BN in those layers.
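A minimal sketch of the BN-statistics matching objective such methods build on (not any specific paper's implementation): synthetic images are optimized so that their per-layer activation statistics match the BatchNorm running statistics stored in a pretrained model. The ResNet-18 backbone, batch size, learning rate, and step count are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torchvision.models as models

model = models.resnet18(weights=None).eval()  # a pretrained model is assumed in practice
for p in model.parameters():
    p.requires_grad_(False)

images = torch.randn(32, 3, 224, 224, requires_grad=True)
opt = torch.optim.Adam([images], lr=0.05)

# Capture the input activation of every BatchNorm layer with forward hooks.
bn_inputs = {}
def make_hook(name):
    def hook(module, inp, out):
        bn_inputs[name] = inp[0]
    return hook

bn_layers = {n: m for n, m in model.named_modules() if isinstance(m, nn.BatchNorm2d)}
for name, m in bn_layers.items():
    m.register_forward_hook(make_hook(name))

for step in range(200):
    opt.zero_grad()
    model(images)
    loss = images.new_zeros(())
    for name, m in bn_layers.items():
        x = bn_inputs[name]
        mean = x.mean(dim=(0, 2, 3))
        var = x.var(dim=(0, 2, 3), unbiased=False)
        # Match the batch statistics of the synthetic images to the stored BN statistics.
        loss = loss + (mean - m.running_mean).pow(2).mean() + (var - m.running_var).pow(2).mean()
    loss.backward()
    opt.step()
```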
1 code implementation • 20 Sep 2023 • Ofir Gordon, Elad Cohen, Hai Victor Habi, Arnon Netzer
In addition, we leverage the Hessian upper bound to improve the selection of weight quantization parameters by focusing on the more sensitive elements in the weight tensors.
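A hedged sketch of sensitivity-weighted threshold selection, where a per-element sensitivity score stands in for the Hessian-based bound; the candidate grid, bit-width, and placeholder scores are illustrative, not the paper's algorithm.

```python
import numpy as np

def quantize(w, threshold, n_bits=8):
    """Symmetric uniform quantization of w onto 2**n_bits levels within [-threshold, threshold]."""
    scale = threshold / (2 ** (n_bits - 1))
    q = np.clip(np.round(w / scale), -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1)
    return q * scale

def select_threshold(w, sensitivity, n_bits=8, candidates=None):
    """Pick the clipping threshold minimizing the sensitivity-weighted quantization error."""
    if candidates is None:
        candidates = np.max(np.abs(w)) * np.linspace(0.5, 1.0, 20)
    errors = [np.sum(sensitivity * (w - quantize(w, t, n_bits)) ** 2) for t in candidates]
    return candidates[int(np.argmin(errors))]

w = np.random.randn(256)
sensitivity = np.abs(np.random.randn(256))  # placeholder for a Hessian-based score per weight
threshold = select_threshold(w, sensitivity)
```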
1 code implementation • 7 Dec 2022 • Idit Diamant, Roy H. Jennings, Oranit Dror, Hai Victor Habi, Arnon Netzer
We propose to reconcile this conflict by aligning the entropy minimization objective with that of the pseudo labels' cross entropy.
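A minimal sketch, under assumptions, of how an entropy-minimization term and a pseudo-label cross-entropy term are typically combined into a single adaptation loss; the weighting and the hard pseudo labels are illustrative and not the paper's reconciliation scheme.

```python
import torch
import torch.nn.functional as F

def adaptation_loss(logits, pseudo_labels, alpha=1.0):
    """logits: (N, C) model outputs; pseudo_labels: (N,) hard pseudo labels."""
    probs = F.softmax(logits, dim=1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
    ce = F.cross_entropy(logits, pseudo_labels)
    return entropy + alpha * ce  # alpha balances the two objectives (illustrative)

logits = torch.randn(16, 10, requires_grad=True)
pseudo_labels = torch.randint(0, 10, (16,))
adaptation_loss(logits, pseudo_labels).backward()
```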
no code implementations • 7 Mar 2022 • Hai Victor Habi, Hagit Messer, Yoram Bresler
The Cramér-Rao bound (CRB), a well-known lower bound on the performance of any unbiased parameter estimator, has been used to study a wide variety of problems.
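For reference, the standard textbook statement of the bound (not specific to this work):

```latex
% Cramér-Rao bound: for any unbiased estimator \hat{\theta} of \theta,
% its covariance is bounded below by the inverse Fisher information matrix.
\operatorname{Cov}_{\theta}\!\bigl(\hat{\theta}\bigr) \succeq I(\theta)^{-1},
\qquad
I(\theta) = \mathbb{E}_{\theta}\!\left[
  \nabla_{\theta} \log p(x;\theta)\,
  \nabla_{\theta} \log p(x;\theta)^{\top}
\right].
```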
1 code implementation • 19 Sep 2021 • Hai Victor Habi, Reuven Peretz, Elad Cohen, Lior Dikstein, Oranit Dror, Idit Diamant, Roy H. Jennings, Arnon Netzer
Neural network quantization enables the deployment of models on edge devices.
Ranked #1 on Quantization on MS COCO
1 code implementation • 12 Apr 2021 • Idit Diamant, Oranit Dror, Hai Victor Habi, Arnon Netzer
Experimental results on the CMU-Panoptic dataset demonstrate the effectiveness of the suggested framework in generating photo-realistic images of persons with new poses that are more consistent across all views in comparison to a standard Image-to-Image baseline.
2 code implementations • ECCV 2020 • Hai Victor Habi, Roy H. Jennings, Arnon Netzer
In this work, we introduce the Hardware Friendly Mixed Precision Quantization Block (HMQ) in order to meet this requirement.
Ranked #13 on Quantization on ImageNet
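A hedged sketch of a mixed-precision quantization block in the spirit of HMQ: a uniform quantizer whose bit-width is softly selected with Gumbel-Softmax and whose threshold is learned. The candidate set and parametrization below are simplifications, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedPrecisionQuantizer(nn.Module):
    """Uniform quantizer that softly selects a bit-width from a small candidate set."""

    def __init__(self, bit_candidates=(2, 4, 8), threshold=1.0):
        super().__init__()
        self.bits = bit_candidates
        self.threshold = nn.Parameter(torch.tensor(float(threshold)))  # assumed positive
        self.logits = nn.Parameter(torch.zeros(len(bit_candidates)))   # bit-width search logits

    def quantize(self, x, n_bits):
        scale = self.threshold / (2 ** (n_bits - 1))
        z = x / scale
        # Straight-through estimator: round in the forward pass, identity gradient.
        z = z + (torch.round(z) - z).detach()
        z = torch.clamp(z, -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1)
        return z * scale

    def forward(self, x, temperature=1.0):
        # Gumbel-Softmax relaxation of the discrete bit-width choice.
        weights = F.gumbel_softmax(self.logits, tau=temperature)
        return sum(w * self.quantize(x, b) for w, b in zip(weights, self.bits))

quantizer = MixedPrecisionQuantizer()
w = torch.randn(64, 3, 3, 3)
w_q = quantizer(w)  # differentiable w.r.t. the threshold and the bit-width logits
```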
1 code implementation • 5 Jul 2019 • Hai Victor Habi, Gil Rafalovich
We propose a method for learning the neural network architecture based on a Genetic Algorithm (GA), as sketched below.
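A minimal, hypothetical sketch of a genetic-algorithm loop over architecture encodings; the gene encoding, toy fitness, and operators are illustrative only (in practice the fitness would come from training and validating each candidate network).

```python
import random

def random_architecture():
    """Encode an architecture as a variable-length list of layer widths (illustrative)."""
    return [random.choice([16, 32, 64, 128]) for _ in range(random.randint(2, 6))]

def fitness(arch):
    """Toy fitness; in practice, train the candidate network and return validation accuracy."""
    return -abs(sum(arch) - 256) / 256.0

def crossover(a, b):
    cut = random.randint(1, min(len(a), len(b)) - 1)
    return a[:cut] + b[cut:]

def mutate(arch, p=0.2):
    return [random.choice([16, 32, 64, 128]) if random.random() < p else g for g in arch]

population = [random_architecture() for _ in range(20)]
for generation in range(10):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]  # keep the fittest half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children
best = max(population, key=fitness)
```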