1 code implementation • 16 Feb 2023 • Minghao Li, Ran Ben Basat, Shay Vargaftik, ChonLam Lao, Kevin Xu, Michael Mitzenmacher, Minlan Yu
To address this communication bottleneck and accelerate training, a widely deployed approach is compression.
no code implementations • 26 May 2022 • Ran Ben Basat, Shay Vargaftik, Amit Portnoy, Gil Einziger, Yaniv Ben-Itzhak, Michael Mitzenmacher
Distributed Mean Estimation (DME), in which $n$ clients communicate vectors to a parameter server that estimates their average, is a fundamental building block in communication-efficient federated learning.
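The DME setup described above can be sketched with a toy unbiased quantizer: each client compresses its vector with stochastic rounding onto a small grid, and the server averages the decoded vectors. This is a generic illustration of the building block, not the compression scheme from the paper; the grid size and helper names (`stochastic_quantize`, `dequantize`) are assumptions for the example.

```python
import numpy as np

def stochastic_quantize(v, levels=16, rng=None):
    """Unbiased stochastic quantization of v onto a uniform grid over [min, max]."""
    rng = np.random.default_rng() if rng is None else rng
    lo, hi = v.min(), v.max()
    if hi == lo:
        return np.zeros_like(v, dtype=np.int64), lo, 0.0
    step = (hi - lo) / (levels - 1)
    scaled = (v - lo) / step                     # position on the grid
    floor = np.floor(scaled)
    # round up with probability equal to the fractional part -> unbiased in expectation
    idx = floor + (rng.random(v.shape) < (scaled - floor))
    return idx.astype(np.int64), lo, step

def dequantize(idx, lo, step):
    return lo + idx * step

# n clients each send a quantized d-dimensional vector; the server averages.
rng = np.random.default_rng(0)
n, d = 256, 32
vectors = rng.normal(size=(n, d))
estimates = [dequantize(*stochastic_quantize(v, rng=rng)) for v in vectors]
est_mean = np.mean(estimates, axis=0)
true_mean = vectors.mean(axis=0)
```

Because each client's quantization error is zero-mean and independent, averaging over many clients drives the estimate close to the true mean even though each client sends only a few bits per coordinate.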
1 code implementation • 19 Aug 2021 • Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher
Distributed Mean Estimation (DME) is a central building block in federated learning, where clients send local gradients to a parameter server for averaging and updating the model.
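The parameter-server loop this sentence describes can be sketched in a few lines; the `server_round` helper and the learning rate are hypothetical names for this illustration, not the paper's API.

```python
import numpy as np

def server_round(model, client_grads, lr=0.1):
    """One parameter-server round: average the clients' local gradients,
    then apply a plain SGD step to update the model."""
    avg_grad = np.mean(client_grads, axis=0)
    return model - lr * avg_grad

model = np.zeros(4)
grads = [np.array([1.0, 0.0, 2.0, 0.0]),
         np.array([3.0, 0.0, 0.0, 4.0])]
model = server_round(model, grads)
# model == [-0.2, 0.0, -0.1, -0.2]
```

DME quality matters because any error in the averaged gradient is injected directly into every model update.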
1 code implementation • NeurIPS 2021 • Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher
We consider the problem where $n$ clients transmit $d$-dimensional real-valued vectors using $d(1+o(1))$ bits each, in a manner that allows the receiver to approximately reconstruct their mean.
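One naive way to hit the $d(1+o(1))$-bit budget is to send one sign bit per coordinate plus a single scalar scale (the scale costs $O(1)$ floats, i.e. $o(d)$ bits as $d$ grows). This sign-and-scale sketch is only an illustration of the bit accounting; the paper's actual scheme controls the reconstruction error differently (e.g. via randomized preprocessing), which this toy version does not attempt.

```python
import numpy as np

def encode(v):
    """Compress v into d sign bits plus one scalar: d(1+o(1)) bits total."""
    signs = v >= 0                  # d bits
    scale = np.abs(v).mean()        # one float; o(d) bits as d grows
    return signs, scale

def decode(signs, scale):
    return scale * np.where(signs, 1.0, -1.0)

rng = np.random.default_rng(1)
n, d = 64, 1024
vectors = rng.normal(size=(n, d))
mean_est = np.mean([decode(*encode(v)) for v in vectors], axis=0)
```

Every decoded coordinate has magnitude exactly `scale`, so the per-client payload is one bit per coordinate plus a constant-size header.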
1 code implementation • 10 Apr 2020 • Muhammad Tirmazi, Ran Ben Basat, Jiaqi Gao, Minlan Yu
In this paper, we leverage programmable switches in the network to partially offload query computation to the switch.
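The partial-offload idea can be illustrated with a toy two-stage aggregation (this is a plain-Python analogy, not the paper's switch program): an in-network stage folds raw records into small per-key partial counts, and the end host only merges those partial states to finish the query.

```python
from collections import Counter

def switch_partial_aggregate(stream):
    """'Switch' stage: fold raw (key, value) records into per-key partial sums,
    shrinking what must travel to the database server."""
    partial = Counter()
    for key, value in stream:
        partial[key] += value
    return partial

def server_finalize(partials):
    """'Server' stage: merge the partial states and produce the final answer."""
    total = Counter()
    for p in partials:
        total.update(p)
    return dict(total)

streams = [[("a", 1), ("b", 2)], [("a", 3)]]
result = server_finalize(switch_partial_aggregate(s) for s in streams)
# result == {"a": 4, "b": 2}
```

The design point is that the aggregation is decomposable: partial sums computed on-path commute with the final merge, so the switch can do part of the work without changing the query's answer.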
Databases • Networking and Internet Architecture
no code implementations • 24 Apr 2018 • Ran Ben Basat, Maayan Goldstein, Itai Segall
Modern software systems are expected to be secure and to include all the latest features, even as new versions are released multiple times an hour.