Search Results for author: Amirhossein Reisizadeh

Found 11 papers, 3 papers with code

EM for Mixture of Linear Regression with Clustered Data

no code implementations · 22 Aug 2023 · Amirhossein Reisizadeh, Khashayar Gatmiry, Asuman Ozdaglar

In many settings, however, heterogeneous data may be generated in clusters with shared structures, as is the case in several applications such as federated learning, where a common latent variable governs the distribution of all the samples generated by a client.

Federated Learning · Regression
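As a rough illustration of the algorithm named in the title, here is a two-component EM sketch for a plain (non-clustered) mixture of linear regressions; the clustered variant studied in the paper is not reproduced, and the unit noise variance in the E-step is a simplification for numerical stability:

```python
import math
import random

random.seed(0)

# Synthetic data from a two-component mixture of linear regressions:
# y = b * x + noise, with slope b drawn from {+2, -2} per sample.
xs, ys = [], []
for _ in range(200):
    x = random.gauss(0, 1)
    b = 2.0 if random.random() < 0.5 else -2.0
    xs.append(x)
    ys.append(b * x + random.gauss(0, 0.1))

b1, b2 = 1.0, -1.0  # initial slope guesses
for _ in range(30):
    # E-step: soft-assign each sample to a component (unit noise variance
    # is a deliberate simplification to keep the weights numerically tame).
    w = []
    for x, y in zip(xs, ys):
        r1 = math.exp(-0.5 * (y - b1 * x) ** 2)
        r2 = math.exp(-0.5 * (y - b2 * x) ** 2)
        w.append(r1 / (r1 + r2))
    # M-step: weighted least squares for each slope.
    b1 = sum(wi * x * y for wi, x, y in zip(w, xs, ys)) / \
         sum(wi * x * x for wi, x in zip(w, xs))
    b2 = sum((1 - wi) * x * y for wi, x, y in zip(w, xs, ys)) / \
         sum((1 - wi) * x * x for wi, x in zip(w, xs))
```

With the true slopes at ±2, the two estimates separate toward the two components after a few iterations.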

Variance-reduced Clipping for Non-convex Optimization

1 code implementation · 2 Mar 2023 · Amirhossein Reisizadeh, Haochuan Li, Subhro Das, Ali Jadbabaie

This is in clear contrast to the well-established assumption in folklore non-convex optimization, a.k.a.

Language Modelling
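The entry concerns clipped gradient methods for non-convex optimization. A minimal sketch of plain gradient clipping on a toy objective — not the paper's variance-reduced estimator, which is not reproduced here:

```python
import math

def clip(g, c):
    """Scale gradient g to have Euclidean norm at most c."""
    n = math.sqrt(sum(v * v for v in g))
    return g if n <= c else [v * c / n for v in g]

# Minimize f(x, y) = x^4 + y^4, whose gradient explodes far from the origin.
x = [5.0, -4.0]
for _ in range(500):
    g = clip([4 * v ** 3 for v in x], 1.0)  # cap the step despite huge gradients
    x = [v - 0.1 * gi for v, gi in zip(x, g)]
```

Far from the origin the clipped step has constant length regardless of the raw gradient's magnitude, which is what lets clipped methods tolerate non-Lipschitz gradients.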

Gradient Descent for Low-Rank Functions

no code implementations · 16 Jun 2022 · Romain Cosson, Ali Jadbabaie, Anuran Makur, Amirhossein Reisizadeh, Devavrat Shah

When $r \ll p$, these complexities are smaller than the known complexities of $\mathcal{O}(p \log(1/\epsilon))$ and $\mathcal{O}(p/\epsilon^2)$ of GD in the strongly convex and non-convex settings, respectively.
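The $r$-dependent rates rest on the fact that a low-rank function's gradients live in an $r$-dimensional subspace: for $f(x) = g(Ax)$ with $A \in \mathbb{R}^{r \times p}$, the chain rule gives $\nabla f(x) = A^\top \nabla g(Ax)$, which lies in the row space of $A$. A small numerical check, using the illustrative choice $f(x) = \tfrac{1}{2}\|Ax\|^2$ (an assumption, not the paper's setup):

```python
import math
import random

random.seed(1)
p, r = 10, 2
# A has shape r x p, so f(x) = 0.5 * ||A x||^2 is a rank-r function.
A = [[random.gauss(0, 1) for _ in range(p)] for _ in range(r)]

def grad_f(x):
    # grad f(x) = A^T (A x): always a combination of A's rows.
    Ax = [sum(A[i][j] * x[j] for j in range(p)) for i in range(r)]
    return [sum(A[i][j] * Ax[i] for i in range(r)) for j in range(p)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x = [random.gauss(0, 1) for _ in range(p)]
g = grad_f(x)

# Project g onto span{A[0], A[1]} by solving the 2x2 Gram system;
# the projection residual should vanish.
g11, g12, g22 = dot(A[0], A[0]), dot(A[0], A[1]), dot(A[1], A[1])
rhs1, rhs2 = dot(A[0], g), dot(A[1], g)
det = g11 * g22 - g12 * g12
c1 = (g22 * rhs1 - g12 * rhs2) / det
c2 = (g11 * rhs2 - g12 * rhs1) / det
proj = [c1 * A[0][j] + c2 * A[1][j] for j in range(p)]
residual = math.sqrt(sum((gi - pi) ** 2 for gi, pi in zip(g, proj)))
```

Since every gradient lies in this fixed $r$-dimensional subspace, descent effectively happens in $r$ dimensions rather than $p$, which is the intuition behind the improved complexities.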

An Optimal Transport Approach to Personalized Federated Learning

no code implementations · 6 Jun 2022 · Farzan Farnia, Amirhossein Reisizadeh, Ramtin Pedarsani, Ali Jadbabaie

In this paper, we focus on this problem and propose FedOT, a novel personalized Federated Learning scheme based on Optimal Transport: a learning algorithm that learns both the optimal transport maps for transferring data points to a common distribution and the prediction model under the applied transport maps.

Personalized Federated Learning
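As a loose illustration of the transport-map idea only (not FedOT itself, which jointly learns the maps and the prediction model): for one-dimensional Gaussian features, the optimal transport map to a common standard normal is affine, so per-client standardization plays the role of the transport step in this hypothetical sketch:

```python
import random
import statistics

random.seed(2)

# Two clients with differently shifted and scaled Gaussian features.
clients = [
    [random.gauss(5.0, 2.0) for _ in range(1000)],
    [random.gauss(-3.0, 0.5) for _ in range(1000)],
]

def affine_transport(xs):
    # For 1-D Gaussians, the OT map to N(0, 1) is affine: x -> (x - mu) / sigma.
    mu = statistics.fmean(xs)
    sigma = statistics.pstdev(xs)
    return [(x - mu) / sigma for x in xs]

# After the per-client affine maps, all clients' features share a common
# zero-mean, unit-variance distribution on which one model can be trained.
mapped = [affine_transport(c) for c in clients]
```
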

Straggler-Resilient Federated Learning: Leveraging the Interplay Between Statistical Accuracy and System Heterogeneity

no code implementations · 28 Dec 2020 · Amirhossein Reisizadeh, Isidoros Tziotis, Hamed Hassani, Aryan Mokhtari, Ramtin Pedarsani

Federated Learning is a novel paradigm that involves learning from data samples distributed across a large network of clients while the data remains local.

Federated Learning

Robust Federated Learning: The Case of Affine Distribution Shifts

no code implementations · NeurIPS 2020 · Amirhossein Reisizadeh, Farzan Farnia, Ramtin Pedarsani, Ali Jadbabaie

In such settings, the training data is often statistically heterogeneous and manifests various distribution shifts across users, which degrades the performance of the learnt model.

Federated Learning · Image Classification

Robust and Communication-Efficient Collaborative Learning

1 code implementation · NeurIPS 2019 · Amirhossein Reisizadeh, Hossein Taheri, Aryan Mokhtari, Hamed Hassani, Ramtin Pedarsani

We consider a decentralized learning problem, where a set of computing nodes aim at solving a non-convex optimization problem collaboratively.

Quantization
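Communication-efficient schemes in this vein typically rely on an unbiased randomized quantizer for the exchanged messages. A generic stochastic-rounding sketch, in the spirit of such quantizers but not necessarily the paper's exact scheme:

```python
import math
import random

random.seed(3)

def stochastic_round(v, step=0.5):
    """Randomly round v onto a grid of spacing `step` so that E[q(v)] = v."""
    lo = step * math.floor(v / step)
    p = (v - lo) / step               # probability of rounding up
    return lo + step if random.random() < p else lo

# Unbiasedness check: the empirical mean of quantized values approaches v,
# so averaging many quantized messages recovers the true value.
v = 0.3
est = sum(stochastic_round(v) for _ in range(200000)) / 200000
```

Unbiasedness is what lets the aggregate of many coarsely quantized messages remain an accurate estimate, trading per-message bandwidth for variance.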

CodedReduce: A Fast and Robust Framework for Gradient Aggregation in Distributed Learning

no code implementations · 6 Feb 2019 · Amirhossein Reisizadeh, Saurav Prakash, Ramtin Pedarsani, Amir Salman Avestimehr

That is, it parallelizes the communications over a tree topology leading to efficient bandwidth utilization, and carefully designs a redundant data set allocation and coding strategy at the nodes to make the proposed gradient aggregation scheme robust to stragglers.
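The tree-aggregation part can be illustrated in isolation: partial gradients flow up a binary tree, and the root obtains the same total as a flat all-to-one reduction, while each node only ever communicates with its children. The coding and redundant data allocation that make CodedReduce straggler-robust are the paper's contribution and are not shown in this sketch:

```python
def tree_aggregate(grads):
    """Sum gradient vectors pairwise up a binary tree."""
    level = list(grads)
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            if i + 1 < len(level):
                nxt.append([u + v for u, v in zip(level[i], level[i + 1])])
            else:
                nxt.append(level[i])   # odd node passes through unchanged
        level = nxt
    return level[0]

# Eight workers, each holding a two-dimensional partial gradient.
grads = [[float(i), float(2 * i)] for i in range(8)]
total = tree_aggregate(grads)
# The tree result equals the flat sum across all workers.
```
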

An Exact Quantized Decentralized Gradient Descent Algorithm

no code implementations · 29 Jun 2018 · Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ramtin Pedarsani

We consider the problem of decentralized consensus optimization, where the sum of $n$ smooth and strongly convex functions is minimized over $n$ distributed agents that form a connected network.

Distributed Optimization · Quantization
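Plain decentralized gradient descent with naively quantized exchanges converges only to a neighborhood of the optimum; the paper's algorithm is designed to converge exactly, which the following hypothetical four-agent ring sketch does not reproduce — it only sets up the consensus problem and the quantized message exchange:

```python
def q(v, step=0.001):
    # Crude deterministic quantizer for transmitted values (illustration only;
    # not the paper's exact-convergence scheme).
    return step * round(v / step)

a = [1.0, 2.0, 3.0, 4.0]   # agent i holds f_i(x) = 0.5 * (x - a_i)^2
x = [0.0] * 4              # minimizer of the average objective: mean(a) = 2.5
alpha = 0.05
for _ in range(500):
    sent = [q(v) for v in x]   # agents broadcast quantized states on a ring
    x = [0.5 * x[i]
         + 0.25 * (sent[(i - 1) % 4] + sent[(i + 1) % 4])
         - alpha * (x[i] - a[i])
         for i in range(4)]
```

With a constant step size and quantized messages, the agents settle near, but not exactly at, the consensus optimum 2.5; removing that residual error is precisely what the exact algorithm achieves.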

Coded Computation over Heterogeneous Clusters

1 code implementation · 21 Jan 2017 · Amirhossein Reisizadeh, Saurav Prakash, Ramtin Pedarsani, Amir Salman Avestimehr

There have been recent results that demonstrate the impact of coding for efficient utilization of computation and storage redundancy to alleviate the effect of stragglers and communication bottlenecks in homogeneous clusters.

Distributed, Parallel, and Cluster Computing · Information Theory
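The straggler-masking principle behind coded computation can be shown with the simplest possible code: split a matrix-vector product into two halves plus one parity task, so any two of the three worker results recover the full answer. The paper's MDS-coded, heterogeneity-aware task allocation is not reproduced here:

```python
def matvec(M, x):
    return [sum(m * v for m, v in zip(row, x)) for row in M]

A = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]]
x = [1.0, -1.0]

# Worker 1 gets the top rows, worker 2 the bottom rows,
# worker 3 the parity (element-wise sum of the two halves).
A1, A2 = A[:2], A[2:]
P = [[u + v for u, v in zip(r1, r2)] for r1, r2 in zip(A1, A2)]

# Suppose worker 2 straggles: its part is decoded from the parity result.
y1 = matvec(A1, x)
yp = matvec(P, x)
y2 = [p - v for p, v in zip(yp, y1)]
result = y1 + y2          # equals the uncoded product A x
```

Any single straggler can be masked this way at the cost of one extra worker; MDS codes generalize the idea to tolerating any $n - k$ stragglers out of $n$ workers.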
