Search Results for author: Rajarshi Saha

Found 7 papers, 2 papers with code

Matrix Compression via Randomized Low Rank and Low Precision Factorization

1 code implementation • NeurIPS 2023 • Rajarshi Saha, Varun Srivastava, Mert Pilanci

We propose an algorithm that exploits this structure to obtain a low rank decomposition of any matrix $\mathbf{A}$ as $\mathbf{A} \approx \mathbf{L}\mathbf{R}$, where $\mathbf{L}$ and $\mathbf{R}$ are the low rank factors.

Image Compression • Quantization
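A minimal sketch of the idea described in the snippet, not the paper's exact algorithm: build a randomized rank-$k$ approximation $\mathbf{A} \approx \mathbf{L}\mathbf{R}$ and store the factors in low precision. The uniform quantizer and function names below are illustrative assumptions.

```python
import numpy as np

def quantize_uniform(x, bits=4):
    """Uniformly quantize the entries of x to 2**bits levels over its range (illustrative)."""
    lo, hi = x.min(), x.max()
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return np.round((x - lo) / scale) * scale + lo

def low_rank_low_precision(A, rank=8, bits=4, seed=0):
    """Sketch: randomized rank-`rank` factorization A ≈ L @ R with quantized factors."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Randomized range finder: project onto a random subspace, then orthonormalize.
    Omega = rng.standard_normal((n, rank))
    Q, _ = np.linalg.qr(A @ Omega)           # m x rank, approximate column basis of A
    L = quantize_uniform(Q, bits)             # low precision left factor
    R = quantize_uniform(Q.T @ A, bits)       # low precision right factor (rank x n)
    return L, R

A = np.random.default_rng(1).standard_normal((128, 64))
L, R = low_rank_low_precision(A, rank=16, bits=6)
print(np.linalg.norm(A - L @ R) / np.linalg.norm(A))  # relative approximation error
```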

Collaborative Mean Estimation over Intermittently Connected Networks with Peer-To-Peer Privacy

no code implementations • 28 Feb 2023 • Rajarshi Saha, Mohamed Seif, Michal Yemini, Andrea J. Goldsmith, H. Vincent Poor

This work considers the problem of Distributed Mean Estimation (DME) over networks with intermittent connectivity, where the goal is to learn a global statistic over the data samples localized across distributed nodes with the help of a central server.
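A toy illustration of the problem setup only, not the paper's estimator: each node holds a local vector, reaches the server with some probability in a given round, and the server averages whatever arrives. The connectivity probability and helper names are assumptions made for this sketch.

```python
import numpy as np

def intermittent_dme(local_vectors, p_connect=0.6, seed=0):
    """Naive distributed mean estimation when each node reaches the server w.p. p_connect."""
    rng = np.random.default_rng(seed)
    received = [x for x in local_vectors if rng.random() < p_connect]
    if not received:
        return np.zeros_like(local_vectors[0])  # nothing arrived this round
    return np.mean(received, axis=0)             # server averages what it received

rng = np.random.default_rng(1)
data = [rng.standard_normal(5) for _ in range(20)]   # one local vector per node
true_mean = np.mean(data, axis=0)
estimate = intermittent_dme(data, p_connect=0.6)
print(np.linalg.norm(estimate - true_mean))          # error caused by dropped nodes
```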

Semi-Decentralized Federated Learning with Collaborative Relaying

no code implementations • 23 May 2022 • Michal Yemini, Rajarshi Saha, Emre Ozfatura, Deniz Gündüz, Andrea J. Goldsmith

We present a semi-decentralized federated learning algorithm wherein clients collaborate by relaying their neighbors' local updates to a central parameter server (PS).

Federated Learning
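A rough sketch of the relaying idea in the snippet, under assumptions not taken from the paper (averaging aggregation, a fixed neighbor map, and one relayed copy per disconnected client): connected clients forward their own update plus cached updates of disconnected neighbors, and the parameter server averages everything it receives.

```python
import numpy as np

def relay_and_aggregate(updates, connected, neighbors):
    """Connected clients upload their own update and relay disconnected neighbors' updates;
    the parameter server averages all updates it receives (illustrative sketch)."""
    received = {}
    for i, ok in enumerate(connected):
        if not ok:
            continue
        received[i] = updates[i]                       # client i's own update
        for j in neighbors[i]:
            if not connected[j]:
                received.setdefault(j, updates[j])     # relay a disconnected neighbor's update
    return np.mean(list(received.values()), axis=0)

rng = np.random.default_rng(0)
updates = [rng.standard_normal(4) for _ in range(4)]   # local model updates
connected = [True, False, True, False]                  # intermittent links to the PS
neighbors = {0: [1], 1: [0], 2: [3], 3: [2]}            # simple client pairing
print(relay_and_aggregate(updates, connected, neighbors))
```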

Minimax Optimal Quantization of Linear Models: Information-Theoretic Limits and Efficient Algorithms

no code implementations • 23 Feb 2022 • Rajarshi Saha, Mert Pilanci, Andrea J. Goldsmith

We derive an information-theoretic lower bound on the minimax risk in this setting and propose a matching upper bound, achieved by randomized embedding-based algorithms, that is tight up to constant factors.

Quantization

Partner-Aware Algorithms in Decentralized Cooperative Bandit Teams

no code implementations • 2 Oct 2021 • Erdem Biyik, Anusha Lalitha, Rajarshi Saha, Andrea Goldsmith, Dorsa Sadigh

Our results show that the proposed partner-aware strategy outperforms other known methods, and our human-subject studies suggest that humans prefer to collaborate with AI agents that implement it.

Decision Making

Efficient Randomized Subspace Embeddings for Distributed Optimization under a Communication Budget

1 code implementation • 13 Mar 2021 • Rajarshi Saha, Mert Pilanci, Andrea J. Goldsmith

As a consequence, quantizing these embeddings followed by an inverse transform to the original space yields a source coding method with optimal covering efficiency while utilizing just $R$-bits per dimension.

Distributed Optimization • Quantization
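A rough sketch of the embed-quantize-invert pipeline the snippet describes, with a plain random orthogonal rotation standing in for the paper's specific embedding and an assumed uniform $R$-bit scalar quantizer:

```python
import numpy as np

def quantize_R_bits(x, R=4):
    """Uniform scalar quantizer using R bits per dimension (illustrative)."""
    lo, hi = x.min(), x.max()
    levels = 2 ** R - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return np.round((x - lo) / scale) * scale + lo

def embed_quantize_invert(g, R=4, seed=0):
    """Randomly rotate the vector, quantize in the embedded space, then rotate back."""
    rng = np.random.default_rng(seed)
    d = g.shape[0]
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))   # random orthogonal embedding
    q = quantize_R_bits(Q @ g, R)                        # R bits per dimension
    return Q.T @ q                                       # inverse transform to original space

g = np.random.default_rng(1).standard_normal(32)        # e.g. a local gradient
g_hat = embed_quantize_invert(g, R=4)
print(np.linalg.norm(g - g_hat) / np.linalg.norm(g))     # relative quantization error
```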
