Search Results for author: Zachary Garrett

Found 10 papers, 6 papers with code

Adaptive Federated Optimization

5 code implementations • ICLR 2021 • Sashank Reddi, Zachary Charles, Manzil Zaheer, Zachary Garrett, Keith Rush, Jakub Konečný, Sanjiv Kumar, H. Brendan McMahan

Federated learning is a distributed machine learning paradigm in which a large number of clients coordinate with a central server to learn a model without sharing their own training data.

Federated Learning
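
The FedOpt pattern this paper develops treats the averaged client update as a pseudo-gradient for a server-side adaptive optimizer. Below is a minimal, illustrative sketch of a FedAdam-style round on a toy quadratic objective; the function names and synthetic clients are assumptions for illustration, not the authors' released code.

    import jax.numpy as jnp

    def local_sgd(params, target, lr=0.1, steps=5):
        # Client loss 0.5 * ||params - target||^2, so grad = params - target.
        local = params
        for _ in range(steps):
            local = local - lr * (local - target)
        return local - params  # the client's model delta

    def fed_adam_round(params, client_targets, m, v, t,
                       server_lr=0.1, b1=0.9, b2=0.99, eps=1e-3):
        # Server treats the (negated) mean delta as a gradient and applies Adam.
        deltas = jnp.stack([local_sgd(params, tgt) for tgt in client_targets])
        g = -jnp.mean(deltas, axis=0)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g ** 2
        step = (m / (1 - b1 ** t)) / (jnp.sqrt(v / (1 - b2 ** t)) + eps)
        return params - server_lr * step, m, v

    params, m, v = jnp.zeros(2), jnp.zeros(2), jnp.zeros(2)
    targets = [jnp.array([1.0, 0.0]), jnp.array([0.0, 1.0])]  # two toy clients
    for t in range(1, 51):
        params, m, v = fed_adam_round(params, targets, m, v, t)
    print(params)  # moves toward the mean of the client optima, ~[0.5, 0.5]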

Federated Reconstruction: Partially Local Federated Learning

3 code implementations • NeurIPS 2021 • Karan Singhal, Hakim Sidahmed, Zachary Garrett, Shanshan Wu, Keith Rush, Sushant Prakash

We also describe the successful deployment of this approach at scale for federated collaborative filtering in a mobile keyboard application.

Collaborative Filtering • Federated Learning • +1

Local Adaptivity in Federated Learning: Convergence and Consistency

no code implementations • 4 Jun 2021 • Jianyu Wang, Zheng Xu, Zachary Garrett, Zachary Charles, Luyang Liu, Gauri Joshi

Popular optimization algorithms of FL use vanilla (stochastic) gradient descent for both local updates at clients and global updates at the aggregating server.

Federated Learning
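
To make the contrast in the snippet concrete, the toy sketch below pairs vanilla local SGD with a locally adaptive (Adagrad-style) client under the same server averaging; the quadratic objective and all names are illustrative assumptions, not the paper's implementation.

    import jax.numpy as jnp

    def local_sgd(params, target, lr=0.1, steps=5):
        # Vanilla local update: plain gradient steps on 0.5*||params - target||^2.
        for _ in range(steps):
            params = params - lr * (params - target)
        return params

    def local_adagrad(params, target, lr=0.1, steps=5, eps=1e-8):
        # Locally adaptive update: per-coordinate Adagrad scaling.
        accum = jnp.zeros_like(params)
        for _ in range(steps):
            g = params - target
            accum = accum + g ** 2
            params = params - lr * g / (jnp.sqrt(accum) + eps)
        return params

    def server_round(params, client_targets, client_opt):
        # "Vanilla" global update: average the clients' final iterates (FedAvg).
        finals = jnp.stack([client_opt(params, tgt) for tgt in client_targets])
        return jnp.mean(finals, axis=0)

    targets = jnp.array([[1.0, 0.0], [0.0, 1.0]])  # two heterogeneous clients
    print(server_round(jnp.zeros(2), targets, local_sgd))
    print(server_round(jnp.zeros(2), targets, local_adagrad))

With heterogeneous clients, the averaged fixed point of the adaptive variant can drift from the SGD one, which is the kind of inconsistency the paper analyzes.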

Federated Automatic Differentiation

no code implementations • 18 Jan 2023 • Keith Rush, Zachary Charles, Zachary Garrett

We propose a federated automatic differentiation (FAD) framework that 1) enables computing derivatives of functions involving client and server computation as well as communication between them and 2) operates in a manner compatible with existing federated technology.

FAD • Federated Learning • +1
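
As an illustration of the idea (my plain-JAX toy, not the authors' FAD framework), the sketch below writes one federated round, broadcast, per-client computation, and aggregation, as a pure function, so standard automatic differentiation can take derivatives through client work, server work, and the communication between them; here, with respect to the client learning rate.

    import jax
    import jax.numpy as jnp

    def round_loss(client_lr, params, client_data):
        broadcast = params                                     # server -> clients
        def client_update(tgt):
            local = broadcast - client_lr * (broadcast - tgt)  # one local SGD step
            return local - broadcast                           # delta -> server
        deltas = jax.vmap(client_update)(client_data)          # per-client work
        new_params = params + jnp.mean(deltas, axis=0)         # server aggregation
        return jnp.mean((new_params - client_data) ** 2)       # post-round loss

    params = jnp.zeros(2)
    data = jnp.array([[1.0, 0.0], [0.0, 1.0]])  # two synthetic clients
    # Derivative of the round's loss w.r.t. the client learning rate:
    print(jax.grad(round_loss)(0.1, params, data))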

Leveraging Function Space Aggregation for Federated Learning at Scale

no code implementations • 17 Nov 2023 • Nikita Dhawan, Nicole Mitchell, Zachary Charles, Zachary Garrett, Gintare Karolina Dziugaite

Many federated learning algorithms, including the canonical Federated Averaging (FedAvg), take a direct (possibly weighted) average of the client parameter updates, motivated by results in distributed optimization.

Distributed Optimization • Federated Learning
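
For reference, the parameter-space step the snippet describes is a weighted average of client deltas, as in the minimal sketch below (assumed names and toy data); the paper's contribution is a function-space alternative to this aggregation.

    import jax.numpy as jnp

    def fedavg_aggregate(params, client_deltas, num_examples):
        # FedAvg-style aggregation: average client updates, weighted by
        # each client's local example count.
        weights = jnp.asarray(num_examples, dtype=jnp.float32)
        weights = weights / weights.sum()
        avg_delta = jnp.tensordot(weights, jnp.stack(client_deltas), axes=1)
        return params + avg_delta

    params = jnp.zeros(3)
    deltas = [jnp.array([0.2, 0.0, 0.1]), jnp.array([-0.1, 0.3, 0.0])]
    print(fedavg_aggregate(params, deltas, num_examples=[30, 10]))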

FAX: Scalable and Differentiable Federated Primitives in JAX

1 code implementation • 11 Mar 2024 • Keith Rush, Zachary Charles, Zachary Garrett

We show that FAX provides an easily programmable, performant, and scalable framework for federated computations in the data center.
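
As a rough sketch of the kind of building blocks involved, the toy code below renders broadcast/map/sum-style federated primitives in plain JAX, with clients as a leading array axis. These stand-ins are assumptions for illustration, not FAX's actual API, which additionally handles sharding and efficient differentiation in the data center.

    import jax
    import jax.numpy as jnp

    NUM_CLIENTS = 4

    def federated_broadcast(x):
        # Replicate a server value across the clients axis.
        return jnp.broadcast_to(x, (NUM_CLIENTS,) + x.shape)

    def federated_map(fn, xs):
        # Apply a function independently to each client's value.
        return jax.vmap(fn)(xs)

    def federated_sum(xs):
        # Reduce client values back to a single server value.
        return jnp.sum(xs, axis=0)

    server_value = jnp.array([1.0, 2.0])
    client_vals = federated_map(lambda v: 2.0 * v, federated_broadcast(server_value))
    print(federated_sum(client_vals))  # [8., 16.]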
