Search Results for author: Amit Portnoy

Found 5 papers, 3 papers with code

QUIC-FL: Quick Unbiased Compression for Federated Learning

no code implementations · 26 May 2022 · Ran Ben Basat, Shay Vargaftik, Amit Portnoy, Gil Einziger, Yaniv Ben-Itzhak, Michael Mitzenmacher

Distributed Mean Estimation (DME), in which $n$ clients communicate vectors to a parameter server that estimates their average, is a fundamental building block in communication-efficient federated learning.

Federated Learning · Quantization
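The DME setup described in the abstract, where $n$ clients send compressed vectors and a server estimates their average, can be illustrated with a toy unbiased compressor. This is only a minimal sketch of the problem, not the QUIC-FL algorithm; the function name `stochastic_round` and its `levels` parameter are hypothetical.

```python
import numpy as np

def stochastic_round(v, levels=16):
    """Unbiased stochastic quantization of v onto a uniform grid.

    Illustrative baseline only -- not the QUIC-FL compression scheme.
    """
    lo, hi = v.min(), v.max()
    scale = (hi - lo) / (levels - 1)
    x = (v - lo) / scale                       # map onto [0, levels-1]
    floor = np.floor(x)
    # Round up with probability equal to the fractional part, so that
    # E[quantized] == original (unbiasedness).
    q = floor + (np.random.rand(*v.shape) < (x - floor))
    return q * scale + lo

# Each "client" quantizes its vector; the server averages the results.
rng = np.random.default_rng(0)
clients = [rng.standard_normal(1000) for _ in range(10)]
true_mean = np.mean(clients, axis=0)
est_mean = np.mean([stochastic_round(c) for c in clients], axis=0)
```

Because each client's quantization error is zero-mean and independent, the per-coordinate error of the averaged estimate shrinks as the number of clients grows.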

EDEN: Communication-Efficient and Robust Distributed Mean Estimation for Federated Learning

1 code implementation · 19 Aug 2021 · Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher

Distributed Mean Estimation (DME) is a central building block in federated learning, where clients send local gradients to a parameter server for averaging and updating the model.

Federated Learning

DRIVE: One-bit Distributed Mean Estimation

1 code implementation · NeurIPS 2021 · Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher

We consider the problem where $n$ clients transmit $d$-dimensional real-valued vectors using $d(1+o(1))$ bits each, in a manner that allows the receiver to approximately reconstruct their mean.

Federated Learning
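The $d(1+o(1))$-bit budget in the abstract, one sign bit per coordinate plus a constant amount of side information, can be sketched with a simplified scaled-sign encoder. This is not the exact DRIVE algorithm (which also applies a random rotation before taking signs); the helper names below are illustrative.

```python
import numpy as np

def one_bit_encode(v):
    """Compress v to one sign bit per coordinate plus a single float scale.

    A simplified scaled-sign sketch: d bits of signs plus O(1) extra bits
    for the scale, i.e. d(1 + o(1)) bits total. Not the exact DRIVE encoding.
    """
    scale = np.abs(v).mean()       # one float of side information
    return np.signbit(v), scale    # signbit is True for negative entries

def one_bit_decode(bits, scale):
    # Reconstruct each coordinate as +/- scale according to its sign bit.
    return np.where(bits, -scale, scale)

# The server decodes each client's message and averages the reconstructions.
rng = np.random.default_rng(1)
clients = [rng.standard_normal(10_000) for _ in range(32)]
true_mean = np.mean(clients, axis=0)
est_mean = np.mean(
    [one_bit_decode(*one_bit_encode(c)) for c in clients], axis=0
)
```

Averaging across many clients washes out much of the per-client reconstruction error, which is the intuition behind using aggressive one-bit compression specifically for mean estimation.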

Towards Federated Learning With Byzantine-Robust Client Weighting

1 code implementation · 10 Apr 2020 · Amit Portnoy, Yoav Tirosh, Danny Hendler

Federated Learning (FL) is a distributed machine learning paradigm where data is distributed among clients who collaboratively train a model in a computation process coordinated by a central server.

Federated Learning
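A common FL aggregation rule weights each client's update by its self-reported dataset size, which a Byzantine client can inflate to dominate the average. One way to bound that influence is to cap the reported sizes before normalizing; the sketch below uses a median cap purely for illustration, and the paper's actual preprocessing rule may differ.

```python
import numpy as np

def truncated_weights(reported_sizes, cap=None):
    """Cap self-reported client dataset sizes before weighted averaging.

    Hedged sketch of the general idea (limit any single client's influence
    on the weighted average); 'cap' and the median default are illustrative.
    """
    sizes = np.asarray(reported_sizes, dtype=float)
    if cap is None:
        cap = np.median(sizes)          # illustrative choice of cap
    clipped = np.minimum(sizes, cap)    # no client exceeds the cap
    return clipped / clipped.sum()      # normalize to a weight vector

# A Byzantine client reporting an enormous dataset gets a bounded weight.
w = truncated_weights([10, 12, 9, 11, 1_000_000])
```

With the cap applied, the lying client's weight is comparable to the honest clients' instead of approaching 1.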
