no code implementations • 5 Feb 2024 • Ran Ben-Basat, Yaniv Ben-Itzhak, Michael Mitzenmacher, Shay Vargaftik
Quantization is a fundamental optimization for many machine-learning use cases, including compressing gradients, model weights and activations, and datasets.
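To make the idea concrete, here is a minimal sketch of an unbiased stochastic quantizer of the kind such compression schemes build on. It is illustrative only: the function name `stochastic_quantize` and its parameters are assumptions, not the paper's algorithm.

```python
import random

def stochastic_quantize(vec, num_levels=16):
    """Uniform stochastic quantization: map each coordinate onto a grid of
    `num_levels` evenly spaced levels between the vector's min and max,
    rounding up or down at random so each coordinate is unbiased."""
    lo, hi = min(vec), max(vec)
    if hi == lo:  # constant vector: nothing to quantize
        return list(vec)
    step = (hi - lo) / (num_levels - 1)
    quantized = []
    for x in vec:
        pos = (x - lo) / step          # position of x on the level grid
        k = int(pos)
        if random.random() < pos - k:  # round up w.p. the fractional part
            k += 1
        k = min(k, num_levels - 1)     # guard against float round-off
        quantized.append(lo + k * step)
    return quantized
```

Since each rounded coordinate equals the original in expectation, averaging many quantized copies recovers the original vector, which is what makes such quantizers attractive for compressing gradients.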
no code implementations • 1 Feb 2023 • Ron Dorfman, Shay Vargaftik, Yaniv Ben-Itzhak, Kfir Y. Levy
Many compression techniques have been proposed to reduce the communication overhead of Federated Learning training procedures.
no code implementations • 13 Oct 2022 • Yaniv Ben-Itzhak, Helen Möllering, Benny Pinkas, Thomas Schneider, Ajith Suresh, Oleksandr Tkachenko, Shay Vargaftik, Christian Weinert, Hossein Yalame, Avishay Yanai
In this paper, we unite both research directions by introducing ScionFL, the first secure aggregation framework for FL that operates efficiently on quantized inputs and simultaneously provides robustness against malicious clients.
no code implementations • 26 May 2022 • Ran Ben Basat, Shay Vargaftik, Amit Portnoy, Gil Einziger, Yaniv Ben-Itzhak, Michael Mitzenmacher
Distributed Mean Estimation (DME), in which $n$ clients communicate vectors to a parameter server that estimates their average, is a fundamental building block in communication-efficient federated learning.
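The DME setup described above can be sketched end to end: each client compresses its vector, and the server decodes and averages. The helper names (`encode`, `decode`, `server_mean`) and the fixed coordinate range are assumptions for illustration, not the paper's scheme.

```python
import random

def encode(vec, bits=4, lo=-1.0, hi=1.0):
    """Client: stochastically round each coordinate onto a 2^bits-level grid
    over a range known to both sides; send only small integer indices."""
    levels = (1 << bits) - 1
    step = (hi - lo) / levels
    idxs = []
    for x in vec:
        pos = (x - lo) / step
        k = int(pos)
        if random.random() < pos - k:  # unbiased randomized rounding
            k += 1
        idxs.append(min(k, levels))
    return idxs

def decode(idxs, bits=4, lo=-1.0, hi=1.0):
    """Map integer indices back to grid values."""
    step = (hi - lo) / ((1 << bits) - 1)
    return [lo + k * step for k in idxs]

def server_mean(messages, bits=4):
    """Server: decode every client's message and average coordinate-wise."""
    n, d = len(messages), len(messages[0])
    mean = [0.0] * d
    for msg in messages:
        for i, x in enumerate(decode(msg, bits)):
            mean[i] += x / n
    return mean
```

Because each client's rounding is unbiased, the independent per-client errors average out at the server, so the mean estimate is far more accurate than any single client's quantized vector.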
1 code implementation • 18 May 2022 • Changgang Zheng, Mingyuan Zang, Xinpeng Hong, Riyad Bensoussane, Shay Vargaftik, Yaniv Ben-Itzhak, Noa Zilberman
To date, no general solution has been provided for mapping machine learning algorithms to programmable network devices.
no code implementations • 17 May 2022 • Changgang Zheng, Zhaoqi Xiong, Thanh T Bui, Siim Kaupmees, Riyad Bensoussane, Antoine Bernabeu, Shay Vargaftik, Yaniv Ben-Itzhak, Noa Zilberman
In this paper, we introduce IIsy, implementing machine learning classification models in a hybrid fashion using off-the-shelf network devices.
1 code implementation • 19 Aug 2021 • Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher
Distributed Mean Estimation (DME) is a central building block in federated learning, where clients send local gradients to a parameter server for averaging and updating the model.
1 code implementation • NeurIPS 2021 • Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher
We consider the problem where $n$ clients transmit $d$-dimensional real-valued vectors using $d(1+o(1))$ bits each, in a manner that allows the receiver to approximately reconstruct their mean.
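One minimal way to meet a $d(1+o(1))$-bit budget is a single sign bit per coordinate plus one shared floating-point scale per vector. The sketch below is illustrative only: it omits any preprocessing (such as a random rotation) and simply shows the bit accounting; the names `sign_encode`, `sign_decode`, and `estimate_mean` are assumptions.

```python
def sign_encode(vec):
    """Encode a d-dimensional vector as d sign bits plus one float scale.
    The scale s = mean(|x_i|) minimizes the squared error of
    reconstructing each coordinate as +/- s."""
    d = len(vec)
    scale = sum(abs(x) for x in vec) / d
    bits = [1 if x >= 0 else 0 for x in vec]
    return bits, scale

def sign_decode(bits, scale):
    """Reconstruct each coordinate as +scale or -scale."""
    return [scale if b else -scale for b in bits]

def estimate_mean(encoded):
    """Receiver: decode every client's message and average coordinate-wise."""
    n = len(encoded)
    d = len(encoded[0][0])
    mean = [0.0] * d
    for bits, scale in encoded:
        for i, x in enumerate(sign_decode(bits, scale)):
            mean[i] += x / n
    return mean
```

Each message costs d bits for the signs plus one 32- or 64-bit scale, i.e. d + O(1) = d(1 + o(1)) bits as d grows.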
no code implementations • 26 Sep 2019 • Shay Vargaftik, Isaac Keslassy, Ariel Orda, Yaniv Ben-Itzhak
Decision-tree-based ensemble classification methods (DTEMs) are a prevalent tool for supervised anomaly detection.
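To make the DTEM term concrete, here is a toy sketch of an ensemble of one-feature decision stumps flagging anomalies by majority vote. It is purely illustrative: real DTEMs (e.g., random forests) train full trees on bootstrapped samples, and `fit_stump` and `ensemble_predict` are hypothetical helpers.

```python
def fit_stump(points, labels, feature):
    """Fit a one-feature decision stump: pick the threshold on `feature`
    that best separates normal (label 0) from anomalous (label 1) points,
    predicting 1 when the feature value exceeds the threshold."""
    best = None
    for t in sorted(set(p[feature] for p in points)):
        correct = sum(1 for p, y in zip(points, labels)
                      if (p[feature] > t) == bool(y))
        if best is None or correct > best[0]:
            best = (correct, t)
    return feature, best[1]

def ensemble_predict(stumps, point):
    """Majority vote over the stumps, as in decision-tree ensembles."""
    votes = sum(1 for f, t in stumps if point[f] > t)
    return 1 if votes * 2 > len(stumps) else 0
```

A full DTEM replaces each stump with a deep tree and decorrelates the members via bagging and feature subsampling, but the aggregation step is the same vote.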