Search Results for author: Mohammad Ali Maddah-Ali

Found 12 papers, 3 papers with code

NeRCC: Nested-Regression Coded Computing for Resilient Distributed Prediction Serving Systems

no code implementations · 6 Feb 2024 · Parsa Moradi, Mohammad Ali Maddah-Ali

Resilience against stragglers is a critical element of prediction serving systems, which run inference on input data using a pre-trained machine-learning model.

regression

ByzSecAgg: A Byzantine-Resistant Secure Aggregation Scheme for Federated Learning Based on Coded Computing and Vector Commitment

no code implementations · 20 Feb 2023 · Tayyebeh Jahani-Nezhad, Mohammad Ali Maddah-Ali, Giuseppe Caire

In this paper, we propose ByzSecAgg, an efficient secure aggregation scheme for federated learning that is protected against Byzantine attacks and privacy leakages.

Federated Learning · Outlier Detection

SwiftAgg+: Achieving Asymptotically Optimal Communication Loads in Secure Aggregation for Federated Learning

no code implementations · 24 Mar 2022 · Tayyebeh Jahani-Nezhad, Mohammad Ali Maddah-Ali, Songze Li, Giuseppe Caire

We propose SwiftAgg+, a novel secure aggregation protocol for federated learning systems, where a central server aggregates local models of $N \in \mathbb{N}$ distributed users, each of size $L \in \mathbb{N}$, trained on their local data, in a privacy-preserving manner.

Federated Learning · Privacy Preserving
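The core idea behind secure aggregation protocols of this kind can be illustrated with pairwise additive masking: each user's upload is perturbed by random masks that cancel exactly in the server's sum, so individual models stay hidden while the aggregate is recovered. The sketch below is a minimal illustration of that principle, not the SwiftAgg+ protocol itself; all function names are hypothetical, and the masks are generated centrally for brevity, whereas a real protocol derives them from pairwise shared secrets.

```python
import random

def pairwise_masks(num_users, model_len, seed=0):
    """For each user pair (i, j), i < j, draw a random mask m; user i adds +m
    and user j adds -m, so all masks cancel in the sum. (Hypothetical helper;
    real protocols derive masks from pairwise key agreement, not a shared seed.)"""
    rng = random.Random(seed)
    masks = {u: [0] * model_len for u in range(num_users)}
    for i in range(num_users):
        for j in range(i + 1, num_users):
            m = [rng.randint(-1000, 1000) for _ in range(model_len)]
            masks[i] = [a + b for a, b in zip(masks[i], m)]
            masks[j] = [a - b for a, b in zip(masks[j], m)]
    return masks

def aggregate(local_models):
    num_users, model_len = len(local_models), len(local_models[0])
    masks = pairwise_masks(num_users, model_len)
    # Each upload is model + mask, so it looks random on its own...
    uploads = [[x + m for x, m in zip(local_models[u], masks[u])]
               for u in range(num_users)]
    # ...but the masks cancel pairwise, and the server gets the exact sum.
    return [sum(col) for col in zip(*uploads)]

print(aggregate([[1, 2], [3, 4], [5, 6]]))  # [9, 12]
```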

SwiftAgg: Communication-Efficient and Dropout-Resistant Secure Aggregation for Federated Learning with Worst-Case Security Guarantees

no code implementations · 8 Feb 2022 · Tayyebeh Jahani-Nezhad, Mohammad Ali Maddah-Ali, Songze Li, Giuseppe Caire

We propose SwiftAgg, a novel secure aggregation protocol for federated learning systems, where a central server aggregates local models of $N$ distributed users, each of size $L$, trained on their local data, in a privacy-preserving manner.

Federated Learning · Privacy Preserving

Optimal Communication-Computation Trade-Off in Heterogeneous Gradient Coding

no code implementations · 2 Mar 2021 · Tayyebeh Jahani-Nezhad, Mohammad Ali Maddah-Ali

Gradient coding allows a master node to recover the aggregate of the partial gradients, computed by worker nodes over their local data sets, with minimum communication cost even in the presence of stragglers.
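The idea can be made concrete with the classic single-straggler example in the style of Tandon et al.'s gradient coding: three workers each send one linear combination of the two partial gradients they hold, and the master recovers the full gradient sum from any two responses. The scalars below stand in for gradient vectors; this is an illustrative sketch of the basic scheme, not the heterogeneous trade-off studied in the paper.

```python
from fractions import Fraction as F

# Three workers, each holding two of the three partial gradients g1, g2, g3.
# Each sends ONE coded combination, yet the master can recover g1 + g2 + g3
# from any two responses, tolerating one straggler.
g1, g2, g3 = F(5), F(7), F(11)

f1 = F(1, 2) * g1 + g2        # worker 1 (holds g1, g2)
f2 = g2 - g3                  # worker 2 (holds g2, g3)
f3 = F(1, 2) * g1 + g3        # worker 3 (holds g1, g3)

# A different decoding combination exists for every surviving pair:
assert f1 + f3 == g1 + g2 + g3          # workers 1 and 3 respond
assert 2 * f1 - f2 == g1 + g2 + g3      # workers 1 and 2 respond
assert f2 + 2 * f3 == g1 + g2 + g3      # workers 2 and 3 respond
print("full gradient recovered from any 2 of 3 workers")
```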

Berrut Approximated Coded Computing: Straggler Resistance Beyond Polynomial Computing

no code implementations · 17 Sep 2020 · Tayyebeh Jahani-Nezhad, Mohammad Ali Maddah-Ali

In this technique, coding is applied across data sets and computation is done over the coded data, such that the results from any subset of worker nodes of a certain size suffice to recover the final result.
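The "Berrut" in the title refers to Berrut's barycentric rational interpolant, which remains numerically stable at equispaced points where high-degree polynomial interpolation breaks down. Below is a minimal sketch of that interpolant only, not the paper's full coded-computing encoder/decoder; the nodes, test function, and evaluation point are illustrative choices.

```python
def berrut_interpolate(xs, fs, x):
    """Berrut's barycentric rational interpolant with weights (-1)^i:
    r(x) = [sum_i (-1)^i f_i / (x - x_i)] / [sum_i (-1)^i / (x - x_i)]."""
    num = den = 0.0
    for i, (xi, fi) in enumerate(zip(xs, fs)):
        if x == xi:                  # r interpolates exactly at the nodes
            return fi
        w = (-1) ** i / (x - xi)
        num += w * fi
        den += w
    return num / den

# Equispaced nodes on [0, 1]: polynomial interpolation of this degree would
# be prone to the Runge phenomenon, but the rational interpolant is stable.
xs = [i / 20 for i in range(21)]
fs = [x * x for x in xs]             # stand-in for a worker computation
approx = berrut_interpolate(xs, fs, 0.61)
```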

DRL-Based QoS-Aware Resource Allocation Scheme for Coexistence of Licensed and Unlicensed Users in LTE and Beyond

no code implementations · 16 Aug 2020 · Mahdi Nouri Boroujerdi, Mohammad Akbari, Roghayeh Joda, Mohammad Ali Maddah-Ali, Babak Hossein Khalaj

In this paper, we employ deep reinforcement learning to develop a novel radio resource allocation and packet scheduling scheme for different Quality of Service (QoS) requirements, applicable to LTE-Advanced and 5G networks.

Reinforcement Learning (RL) +1

A Hybrid-Order Distributed SGD Method for Non-Convex Optimization to Balance Communication Overhead, Computational Complexity, and Convergence Rate

no code implementations · 27 Mar 2020 · Naeimeh Omidvar, Mohammad Ali Maddah-Ali, Hamed Mahdavi

In this paper, we propose a distributed stochastic gradient descent (SGD) method that achieves low communication load and computational complexity while retaining a fast convergence rate.

Corella: A Private Multi Server Learning Approach based on Correlated Queries

1 code implementation · 26 Mar 2020 · Hamidreza Ehteram, Mohammad Ali Maddah-Ali, Mahtab Mirmohseni

One of the major challenges in this setup is to guarantee the privacy of the client data.

Coded Fourier Transform

no code implementations · 17 Oct 2017 · Qian Yu, Mohammad Ali Maddah-Ali, A. Salman Avestimehr

We consider the problem of computing the Fourier transform of high-dimensional vectors, distributedly over a cluster of machines consisting of a master node and multiple worker nodes, where the worker nodes can only store and process a fraction of the inputs.

Polynomial Codes: an Optimal Design for High-Dimensional Coded Matrix Multiplication

3 code implementations · NeurIPS 2017 · Qian Yu, Mohammad Ali Maddah-Ali, A. Salman Avestimehr

We consider a large-scale matrix multiplication problem where the computation is carried out using a distributed system with a master node and multiple worker nodes, where each worker can store parts of the input matrices.

Information Theory · Distributed, Parallel, and Cluster Computing
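The polynomial-code construction can be illustrated end to end for the smallest case: encode the row blocks of $A$ as $p_A(x) = A_0 + A_1 x$ and the column blocks of $B$ as $p_B(x) = B_0 + B_1 x^2$, so the four block products of $C = AB$ appear as the coefficients of the degree-3 product polynomial, recoverable from any 4 worker evaluations. The sketch below substitutes scalars for matrix blocks and uses a generic exact Lagrange decoder; it illustrates the m = n = 2 case under those simplifications, not the paper's general implementation.

```python
from fractions import Fraction as F
from itertools import combinations

def lagrange_coeffs(points):
    """Exact coefficients (low to high degree) of the unique polynomial
    of degree len(points) - 1 through the given (x, y) points."""
    n = len(points)
    coeffs = [F(0)] * n
    for i, (xi, yi) in enumerate(points):
        basis = [F(1)]                      # running product of (x - xj)
        denom = F(1)
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            nxt = [F(0)] * (len(basis) + 1)
            for k, c in enumerate(basis):   # multiply basis by (x - xj)
                nxt[k + 1] += c
                nxt[k] -= xj * c
            basis = nxt
            denom *= xi - xj
        for k in range(n):
            coeffs[k] += yi * basis[k] / denom
    return coeffs

# Scalar stand-ins for the row blocks of A and column blocks of B.
A0, A1, B0, B1 = F(2), F(3), F(5), F(7)

# Worker w evaluates p_A and p_B at its own point x_w and returns the
# single product p_A(x_w) * p_B(x_w).
shares = [(F(w), (A0 + A1 * F(w)) * (B0 + B1 * F(w) ** 2)) for w in range(1, 7)]

# p_A(x) * p_B(x) = A0*B0 + A1*B0 x + A0*B1 x^2 + A1*B1 x^3 has degree 3,
# so ANY 4 of the 6 responses determine all four block products (MDS property).
for subset in combinations(shares, 4):
    assert lagrange_coeffs(list(subset)) == [A0 * B0, A1 * B0, A0 * B1, A1 * B1]
print("any 4 of 6 workers recover C")
```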

Coded TeraSort

2 code implementations · 16 Feb 2017 · Songze Li, Sucha Supittayapornpong, Mohammad Ali Maddah-Ali, A. Salman Avestimehr

We focus on sorting, which is the building block of many machine learning algorithms, and propose a novel distributed sorting algorithm, named Coded TeraSort, which substantially improves the execution time of the TeraSort benchmark in Hadoop MapReduce.

Distributed, Parallel, and Cluster Computing · Information Theory
