Search Results for author: Shay Vargaftik

Found 12 papers, 5 papers with code

Optimal and Near-Optimal Adaptive Vector Quantization

no code implementations • 5 Feb 2024 • Ran Ben-Basat, Yaniv Ben-Itzhak, Michael Mitzenmacher, Shay Vargaftik

Quantization is a fundamental optimization for many machine-learning use cases, including compressing gradients, model weights and activations, and datasets.

Quantization
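
The paper's optimal constructions are beyond a snippet, but the following minimal sketch (an illustration under stated assumptions, not the paper's algorithm) shows the basic idea of adaptive quantization: fit the codebook to the data, here via empirical quantiles, instead of using a fixed uniform grid.

```python
import numpy as np

def adaptive_quantize(x, num_levels=4):
    """Quantize x against a codebook fit to the data itself (quantile
    levels) rather than a fixed uniform grid. A toy stand-in for the
    adaptive vector quantization the paper optimizes, not its algorithm."""
    levels = np.quantile(x, np.linspace(0, 1, num_levels))  # data-driven codebook
    idx = np.abs(x[:, None] - levels[None, :]).argmin(axis=1)
    return idx, levels          # transmit small indices plus the codebook

x = np.random.default_rng(0).standard_normal(10_000)
idx, levels = adaptive_quantize(x)
print("MSE:", np.mean((x - levels[idx]) ** 2))
```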

DoCoFL: Downlink Compression for Cross-Device Federated Learning

no code implementations • 1 Feb 2023 • Ron Dorfman, Shay Vargaftik, Yaniv Ben-Itzhak, Kfir Y. Levy

Many compression techniques have been proposed to reduce the communication overhead of Federated Learning training procedures.

Federated Learning

ScionFL: Efficient and Robust Secure Quantized Aggregation

no code implementations • 13 Oct 2022 • Yaniv Ben-Itzhak, Helen Möllering, Benny Pinkas, Thomas Schneider, Ajith Suresh, Oleksandr Tkachenko, Shay Vargaftik, Christian Weinert, Hossein Yalame, Avishay Yanai

In this paper, we unite both research directions by introducing ScionFL, the first secure aggregation framework for FL that operates efficiently on quantized inputs and simultaneously provides robustness against malicious clients.

Federated Learning • Quantization
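
ScionFL's protocol is considerably more involved (it operates over quantized inputs and defends against malicious clients), but the masking idea at the heart of secure aggregation can be sketched as follows. This is the classic pairwise-masking construction, not ScionFL itself, and all names are illustrative.

```python
import numpy as np

def pairwise_masked(updates, seed=0):
    """Each client pair (i, j) derives the same mask from shared
    randomness; i adds it and j subtracts it, so all masks cancel in
    the sum and the server sees only the aggregate. Classic secure
    aggregation masking, not ScionFL's full protocol."""
    n, d = updates.shape
    masked = updates.astype(float).copy()
    for i in range(n):
        for j in range(i + 1, n):
            mask = np.random.default_rng([seed, i, j]).standard_normal(d)
            masked[i] += mask
            masked[j] -= mask
    return masked

updates = np.random.default_rng(1).standard_normal((5, 8))
masked = pairwise_masked(updates)
print(np.allclose(masked.sum(axis=0), updates.sum(axis=0)))  # True
```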

QUIC-FL: Quick Unbiased Compression for Federated Learning

no code implementations • 26 May 2022 • Ran Ben Basat, Shay Vargaftik, Amit Portnoy, Gil Einziger, Yaniv Ben-Itzhak, Michael Mitzenmacher

Distributed Mean Estimation (DME), in which $n$ clients communicate vectors to a parameter server that estimates their average, is a fundamental building block in communication-efficient federated learning.

Federated Learning • Quantization
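
QUIC-FL's contribution is a near-optimal DME quantizer; the snippet below shows only the textbook baseline such schemes improve on, unbiased stochastic rounding onto a uniform grid, so that averaging many clients' estimates converges to the true mean.

```python
import numpy as np

def stochastic_quantize(v, bits=1, rng=None):
    """Unbiased stochastic rounding onto a 2**bits-level uniform grid
    over [min(v), max(v)]. The textbook DME baseline, not QUIC-FL."""
    rng = rng or np.random.default_rng()
    lo, hi = v.min(), v.max()
    scale = (hi - lo) / (2 ** bits - 1) if hi > lo else 1.0
    pos = (v - lo) / scale
    # Round up with probability equal to the fractional part => E[q] = v.
    q = np.floor(pos) + (rng.random(v.shape) < pos - np.floor(pos))
    return lo + q * scale

rng = np.random.default_rng(0)
v = rng.standard_normal(1_000)
ests = [stochastic_quantize(v, rng=rng).mean() for _ in range(256)]
print(v.mean(), np.mean(ests))   # close, since each estimate is unbiased
```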

Automating In-Network Machine Learning

1 code implementation • 18 May 2022 • Changgang Zheng, Mingyuan Zang, Xinpeng Hong, Riyad Bensoussane, Shay Vargaftik, Yaniv Ben-Itzhak, Noa Zilberman

To date, no general solution has been provided for mapping machine learning algorithms to programmable network devices.

Anomaly Detection • BIG-bench Machine Learning

IIsy: Practical In-Network Classification

no code implementations • 17 May 2022 • Changgang Zheng, Zhaoqi Xiong, Thanh T Bui, Siim Kaupmees, Riyad Bensoussane, Antoine Bernabeu, Shay Vargaftik, Yaniv Ben-Itzhak, Noa Zilberman

In this paper, we introduce IIsy, implementing machine learning classification models in a hybrid fashion using off-the-shelf network devices.

BIG-bench Machine Learning • Classification
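
IIsy's mapping targets real switch pipelines and several model types; purely as a hypothetical illustration of the idea, a trained classifier can be flattened into range-match rules of the kind a match-action table evaluates per packet. All names and thresholds below are made up for the sketch.

```python
# Hypothetical illustration: flatten a tiny classifier over one packet
# feature into range-match rules, the kind of lookup a programmable
# switch's match-action table performs. IIsy's real mapping covers
# multiple features and model types.
RULES = [
    # (lo, hi) range on packet size -> class label
    ((0,    128), "control"),
    ((128, 1024), "web"),
    ((1024, 9000), "bulk"),
]

def classify(pkt_size: int) -> str:
    for (lo, hi), label in RULES:
        if lo <= pkt_size < hi:      # a TCAM/range match in hardware
            return label
    return "unknown"

print(classify(64), classify(1500))
```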

EDEN: Communication-Efficient and Robust Distributed Mean Estimation for Federated Learning

1 code implementation • 19 Aug 2021 • Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher

Distributed Mean Estimation (DME) is a central building block in federated learning, where clients send local gradients to a parameter server for averaging and updating the model.

Federated Learning
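
EDEN's estimator is beyond a snippet, but the surrounding DME loop is easy to sketch: each client applies an unbiased compressor to its gradient, the server averages the decoded vectors, and unbiasedness makes the error shrink as more clients participate. The toy one-bit compressor below is illustrative, not EDEN's scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_bit_unbiased(v):
    """Toy 1-bit-per-coordinate compressor: send +/-B per coordinate,
    with probabilities chosen so the expectation equals v (not EDEN's
    scheme, which is more careful about scaling)."""
    B = max(np.max(np.abs(v)), 1e-12)
    p_up = (1 + v / B) / 2                     # P[+B]; gives E[est] = v
    return B * np.where(rng.random(v.shape) < p_up, 1.0, -1.0)

clients, d = 64, 1_000
grads = rng.standard_normal((clients, d))      # local gradients
mean_est = np.mean([one_bit_unbiased(g) for g in grads], axis=0)
true_mean = grads.mean(axis=0)
print(np.mean((mean_est - true_mean) ** 2))    # shrinks as clients grow
```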

DRIVE: One-bit Distributed Mean Estimation

1 code implementation • NeurIPS 2021 • Shay Vargaftik, Ran Ben Basat, Amit Portnoy, Gal Mendelson, Yaniv Ben-Itzhak, Michael Mitzenmacher

We consider the problem where $n$ clients transmit $d$-dimensional real-valued vectors using $d(1+o(1))$ bits each, in a manner that allows the receiver to approximately reconstruct their mean.

Federated Learning
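
In the spirit of the abstract, and simplified from the paper, the encoding can be sketched as: rotate the vector with a shared random rotation, transmit the d signs plus a single scale (hence d(1+o(1)) bits), and invert the rotation at the receiver. DRIVE's actual construction uses fast structured rotations and analyzed scale choices; the dense QR rotation and mean-absolute-value scale below are simplifications.

```python
import numpy as np

def shared_rotation(d, seed):
    # Dense random orthogonal matrix for clarity; the paper uses fast
    # structured transforms (e.g., randomized Hadamard) instead.
    q, _ = np.linalg.qr(np.random.default_rng(seed).standard_normal((d, d)))
    return q

def encode(x, R):
    z = R @ x
    return np.sign(z), np.abs(z).mean()   # d sign bits + one scale float

def decode(signs, scale, R):
    return R.T @ (scale * signs)          # undo the shared rotation

d = 256
x = np.random.default_rng(1).standard_normal(d)
R = shared_rotation(d, seed=7)            # both sides derive the same R
x_hat = decode(*encode(x, R), R)
print(np.linalg.norm(x - x_hat) / np.linalg.norm(x))  # moderate error
```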

How to send a real number using a single bit (and some shared randomness)

no code implementations • 5 Oct 2020 • Ran Ben-Basat, Michael Mitzenmacher, Shay Vargaftik

We consider both the biased and unbiased estimation problems and aim to minimize the cost.
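
As a concrete and deliberately naive instance of the problem (a baseline to fix ideas, not the paper's scheme): with shared randomness, a value x in [0, 1] can be sent as the single bit 1{u < x} for a uniform u both sides can regenerate, and the bit itself is already an unbiased estimate of x. The paper asks how much better one can do in both the biased and unbiased settings.

```python
import numpy as np

def send(x, u):
    """Encode x in [0, 1] as one bit using a shared uniform draw u."""
    return int(u < x)            # P[bit = 1] = x, so E[bit] = x

x = 0.37
us = np.random.default_rng(7).random(100_000)   # shared randomness
bits = np.fromiter((send(x, u) for u in us), dtype=float)
print(bits.mean())               # ~0.37: the naive estimator is unbiased
```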

AnchorHash: A Scalable Consistent Hash

2 code implementations • 23 Dec 2018 • Gal Mendelson, Shay Vargaftik, Katherine Barabash, Dean Lorenz, Isaac Keslassy, Ariel Orda

Consistent hashing (CH) is a central building block in many networking applications, from datacenter load-balancing to distributed storage.

Data Structures and Algorithms
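
AnchorHash's anchor-set construction is what makes it scale; as background only, the classic ring formulation of consistent hashing below shows the building block's contract: each key maps to the nearest bucket clockwise, so removing a bucket remaps only the keys it owned. This is not AnchorHash's algorithm.

```python
import bisect
import hashlib

class HashRing:
    """Classic ring-based consistent hashing (not AnchorHash): each
    bucket owns the arc before its points, so removing a bucket only
    remaps the keys that were on its arcs."""

    def __init__(self, buckets, vnodes=64):
        self._ring = sorted(
            (self._h(f"{b}#{i}"), b) for b in buckets for i in range(vnodes)
        )
        self._points = [p for p, _ in self._ring]

    @staticmethod
    def _h(s):
        return int(hashlib.md5(s.encode()).hexdigest(), 16)

    def lookup(self, key):
        i = bisect.bisect(self._points, self._h(key)) % len(self._ring)
        return self._ring[i][1]

ring = HashRing(["a", "b", "c"])
print(ring.lookup("some-flow-id"))
```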
