no code implementations • 26 May 2022 • Ran Ben Basat, Shay Vargaftik, Amit Portnoy, Gil Einziger, Yaniv Ben-Itzhak, Michael Mitzenmacher
Distributed Mean Estimation (DME), in which $n$ clients communicate vectors to a parameter server that estimates their average, is a fundamental building block in communication-efficient federated learning.
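As a point of reference for the compression schemes this line of work studies, the uncompressed DME baseline can be sketched as follows (a minimal illustration, not the paper's algorithm; the client vectors here are hypothetical):

```python
import numpy as np

def server_mean(client_vectors):
    """Average the vectors reported by all n clients (uncompressed baseline).

    Each client sends its full d-dimensional real-valued vector, so the
    estimate is exact but the communication cost is d full-precision
    values per client -- the cost that DME compression aims to reduce.
    """
    return np.mean(np.stack(client_vectors), axis=0)

# Hypothetical example: n = 3 clients, d = 4.
clients = [np.array([1.0, 2.0, 3.0, 4.0]),
           np.array([0.0, 0.0, 0.0, 0.0]),
           np.array([2.0, 4.0, 6.0, 8.0])]
est = server_mean(clients)  # exact mean of the three vectors
```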
no code implementations • ACL 2022 • Nachshon Cohen, Amit Portnoy, Besnik Fetahu, Amir Ingber
BERT-based ranking models have achieved superior performance on various information retrieval tasks.
1 code implementation • 19 Aug 2021 • Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher
Distributed Mean Estimation (DME) is a central building block in federated learning, where clients send local gradients to a parameter server for averaging and updating the model.
1 code implementation • NeurIPS 2021 • Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher
We consider the problem where $n$ clients transmit $d$-dimensional real-valued vectors using $d(1+o(1))$ bits each, in a manner that allows the receiver to approximately reconstruct their mean.
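One simple way to reach roughly $d(1+o(1))$ bits per client is to send one sign bit per coordinate plus a single shared scale. The sketch below shows this sign-plus-scale idea only as an illustration of the bit budget; it is not the paper's scheme, and the example vector is hypothetical:

```python
import numpy as np

def encode(x):
    """Compress x to one sign bit per coordinate plus one float scale.

    Total cost: d bits + O(1) bits for the scale, i.e. d(1 + o(1)) bits.
    The scale mean(|x|) minimizes the L2 error of a sign-based code.
    """
    scale = np.mean(np.abs(x))
    bits = np.signbit(x)  # d sign bits
    return bits, scale

def decode(bits, scale):
    """Reconstruct an approximation of x from its sign bits and scale."""
    return np.where(bits, -scale, scale)

# Hypothetical example: each coordinate is mapped to +/- mean(|x|) = 2.
x = np.array([1.0, -1.0, 3.0, -3.0])
x_hat = decode(*encode(x))
```

The server would average the decoded vectors from all clients to form its mean estimate; more sophisticated schemes (e.g. applying a random rotation before quantizing) reduce the reconstruction error further.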
1 code implementation • 10 Apr 2020 • Amit Portnoy, Yoav Tirosh, Danny Hendler
Federated Learning (FL) is a distributed machine learning paradigm where data is distributed among clients, who collaboratively train a model in a computation process coordinated by a central server.
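The coordination loop described above can be sketched as a single federated round: the server broadcasts the model, each client updates it on local data, and the server averages the results. This is a minimal FedAvg-style illustration on a hypothetical least-squares problem, not the paper's method:

```python
import numpy as np

def local_update(weights, data, lr=0.1):
    """One hypothetical local step: gradient descent on least squares."""
    X, y = data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def fl_round(weights, client_data):
    """Server broadcasts weights; clients train locally; server averages."""
    updates = [local_update(weights.copy(), d) for d in client_data]
    return np.mean(updates, axis=0)

# Hypothetical toy problem: two clients share the true model w* = [1, 2].
rng = np.random.default_rng(0)
w_true = np.array([1.0, 2.0])

def make_client():
    X = rng.normal(size=(32, 2))
    return X, X @ w_true

client_data = [make_client(), make_client()]
w = np.zeros(2)
for _ in range(200):
    w = fl_round(w, client_data)
```

In practice the averaged quantity is often a gradient or model delta rather than the full weights, which is where the DME compression above plugs in.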