no code implementations • 25 Jan 2024 • Yaniv Blumenfeld, Itay Hubara, Daniel Soudry
The majority of the research on the quantization of Deep Neural Networks (DNNs) is focused on reducing the precision of tensors visible to high-level frameworks (e.g., weights, activations, and gradients).
no code implementations • NeurIPS 2023 • Chen Zeno, Greg Ongie, Yaniv Blumenfeld, Nir Weinberger, Daniel Soudry
Neural network (NN) denoisers are an essential building block in many common tasks, ranging from image reconstruction to image generation.
1 code implementation • ICML 2020 • Yaniv Blumenfeld, Dar Gilboa, Daniel Soudry
Deep neural networks are typically initialized with random weights, with variances chosen to facilitate signal propagation and stable gradients.
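As an illustration of variance-scaled random initialization, here is a minimal sketch of the common "He" scheme (variance 2/fan_in, suited to ReLU layers), which keeps signal magnitude roughly constant across depth. This is a generic example, not the initialization studied in the paper:

```python
# Minimal sketch of variance-scaled ("He") initialization.
# Generic illustration only -- not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

def he_init(fan_in: int, fan_out: int) -> np.ndarray:
    """Draw i.i.d. Gaussian weights with variance 2 / fan_in."""
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# Push a random signal through several ReLU layers; with this scaling
# the per-unit signal magnitude stays on the same order of magnitude
# instead of exploding or vanishing with depth.
x = rng.normal(size=(1024, 256))
for _ in range(5):
    W = he_init(256, 256)
    x = np.maximum(x @ W, 0.0)  # ReLU
```

With a naive choice (e.g., unit-variance weights), the same loop would blow up by a factor of roughly fan_in per layer, which is why the variance is tied to the layer width.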
1 code implementation • 11 Dec 2019 • Yaniv Blumenfeld, Dar Gilboa, Daniel Soudry
Standard practice in training neural networks involves initializing the weights in an independent fashion.
1 code implementation • NeurIPS 2019 • Yaniv Blumenfeld, Dar Gilboa, Daniel Soudry
Reducing the precision of weights and activation functions in neural network training, with minimal impact on performance, is essential for the deployment of these models in resource-constrained environments.
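For concreteness, a minimal sketch of the generic scheme behind such low-precision methods: symmetric uniform quantization of a weight tensor to 8-bit integers with a per-tensor scale. This is illustrative only, not the method proposed in the paper:

```python
# Minimal sketch of symmetric uniform int8 quantization.
# Generic illustration only -- not the paper's method.
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float weights to int8 using a single per-tensor scale."""
    scale = float(np.abs(w).max()) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float tensor from the int8 codes."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(64, 64)).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# The round-trip error of each weight is bounded by half a
# quantization step (scale / 2), since no value is clipped here.
```

The storage drops from 32 to 8 bits per weight at the cost of a bounded rounding error, which is the basic trade-off low-precision training and inference methods work with.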