Search Results for author: Keith Rush

Found 10 papers, 6 papers with code

FAX: Scalable and Differentiable Federated Primitives in JAX

1 code implementation • 11 Mar 2024 • Keith Rush, Zachary Charles, Zachary Garrett

We show that FAX provides an easily programmable, performant, and scalable framework for federated computations in the data center.
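As a minimal sketch of the kind of federated primitives such a framework exposes (broadcast, per-client map, aggregate) — the names, signatures, and toy round below are illustrative assumptions, not FAX's actual API:

```python
import numpy as np

# Hypothetical federated building blocks modeled as plain functions over a
# list of per-client values; illustrative only, not FAX's real interface.

def federated_broadcast(server_value, num_clients):
    """Replicate a server-held value to every client."""
    return [server_value for _ in range(num_clients)]

def federated_map(fn, client_values):
    """Apply the same function independently on each client's value."""
    return [fn(v) for v in client_values]

def federated_sum(client_values):
    """Aggregate client-held values back on the server."""
    return sum(client_values)

# Toy round: broadcast a model, compute a stand-in "local gradient" on
# each client, then average on the server.
model = np.array([1.0, -2.0])
client_data = [np.array([0.5, 0.5]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]

models = federated_broadcast(model, len(client_data))
grads = federated_map(lambda pair: pair[0] * pair[1], list(zip(models, client_data)))
avg_grad = federated_sum(grads) / len(client_data)  # -> array([0.5, -1.0])
```

Expressing rounds in terms of a few such primitives is what lets a framework shard and differentiate them uniformly.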

Federated Automatic Differentiation

no code implementations • 18 Jan 2023 • Keith Rush, Zachary Charles, Zachary Garrett

We propose a federated automatic differentiation (FAD) framework that 1) enables computing derivatives of functions involving client and server computation as well as communication between them and 2) operates in a manner compatible with existing federated technology.

FAD • Federated Learning • +1
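A rough sketch of the underlying idea (toy stand-ins, not the paper's FAD system): when a round is built from client-local losses plus a server aggregation, its derivative decomposes by the chain rule into per-client derivatives that can be computed locally and aggregated the same way the losses are.

```python
# Hypothetical federated computation: clients compute local losses on a
# broadcast parameter w, and the server averages them.

def federated_loss(w, client_data):
    losses = [(w - x) ** 2 for x in client_data]     # client-local computation
    return sum(losses) / len(losses)                 # server aggregation

def federated_grad(w, client_data):
    # Each client differentiates only its own loss; the server aggregates
    # the per-client derivatives exactly as it aggregates the losses.
    local_grads = [2.0 * (w - x) for x in client_data]
    return sum(local_grads) / len(local_grads)

data = [1.0, 2.0, 4.0]
w = 0.5
g = federated_grad(w, data)

# Finite-difference check that the aggregated per-client derivatives
# equal d/dw of the end-to-end federated loss.
eps = 1e-6
fd = (federated_loss(w + eps, data) - federated_loss(w - eps, data)) / (2 * eps)
```

Because the gradient computation has the same client/server communication pattern as the forward computation, it stays compatible with existing federated infrastructure.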

Multi-Epoch Matrix Factorization Mechanisms for Private Machine Learning

1 code implementation • 12 Nov 2022 • Christopher A. Choquette-Choo, H. Brendan McMahan, Keith Rush, Abhradeep Thakurta

We formalize the problem of DP mechanisms for adaptive streams with multiple participations and introduce a non-trivial extension of online matrix factorization DP mechanisms to our setting.

Image Classification • Language Modelling
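A minimal sketch of the single-participation matrix mechanism this builds on (not the paper's multi-epoch extension): to privately release the prefix sums A @ x of a gradient stream x, factor A = B @ C, add noise to C @ x, and map the noisy result through B. The error depends on the factorization, which is what gets optimized; the Cholesky choice below is just one valid option.

```python
import numpy as np

# Workload: all prefix sums of an n-step stream, i.e. a lower-triangular
# all-ones matrix A applied to the stream x.
n = 4
A = np.tril(np.ones((n, n)))            # prefix-sum workload matrix

B = np.linalg.cholesky(A @ A.T)         # one square-root factorization ...
C = np.linalg.solve(B, A)               # ... chosen so that B @ C == A

x = np.array([1.0, -1.0, 2.0, 0.5])     # the sensitive stream
sigma = 0.0                             # noise scale (0 here, to check exactness)
z = sigma * np.random.default_rng(0).standard_normal(n)
release = B @ (C @ x + z)               # noisy prefix sums; equals A @ x when sigma == 0
```

With sigma > 0, the Gaussian noise on C @ x provides the privacy guarantee, and the mechanism's error scales with the norms of B's rows and C's columns.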

Improved Differential Privacy for SGD via Optimal Private Linear Operators on Adaptive Streams

1 code implementation • 16 Feb 2022 • Sergey Denisov, Brendan McMahan, Keith Rush, Adam Smith, Abhradeep Guha Thakurta

Motivated by recent applications requiring differential privacy over adaptive streams, we investigate the question of optimal instantiations of the matrix mechanism in this setting.

Federated Learning

Iterated Vector Fields and Conservatism, with Applications to Federated Learning

no code implementations • 8 Sep 2021 • Zachary Charles, Keith Rush

In the context of federated learning, we show that when clients have loss functions whose gradients satisfy this condition, federated averaging is equivalent to gradient descent on a surrogate loss function.

Federated Learning
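A toy numerical illustration of the flavor of this equivalence (not the paper's general proof): with quadratic client losses f_i(w) = 0.5 * a_i * (w - c_i)**2, one round of federated averaging (K local gradient steps per client, then averaging) is an affine map w -> m*w + b, and an affine update is exactly a gradient-descent step on some surrogate quadratic loss. All constants below are made up.

```python
def fedavg_round(w, clients, lr=0.1, local_steps=5):
    # One round of federated averaging on scalar quadratic client losses.
    updated = []
    for a, c in clients:
        wi = w
        for _ in range(local_steps):
            wi -= lr * a * (wi - c)     # local gradient step on f_i
        updated.append(wi)
    return sum(updated) / len(updated)  # server averages the client models

clients = [(1.0, 2.0), (0.5, -1.0), (2.0, 0.0)]  # (a_i, c_i) pairs

# Verify affinity: two evaluations of the round map determine the line
# m*w + b, so a third evaluation must land on that same line.
w0, w1, w2 = 0.0, 1.0, 3.0
r0, r1, r2 = (fedavg_round(w, clients) for w in (w0, w1, w2))
m, b = r1 - r0, r0
```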

Federated Reconstruction: Partially Local Federated Learning

3 code implementations • NeurIPS 2021 • Karan Singhal, Hakim Sidahmed, Zachary Garrett, Shanshan Wu, Keith Rush, Sushant Prakash

We also describe the successful deployment of this approach at scale for federated collaborative filtering in a mobile keyboard application.

Collaborative Filtering • Federated Learning • +1

Fast Dimension Independent Private AdaGrad on Publicly Estimated Subspaces

no code implementations • 14 Aug 2020 • Peter Kairouz, Mónica Ribero, Keith Rush, Abhradeep Thakurta

In particular, we show that if the gradients lie in a known constant rank subspace, and assuming algorithmic access to an envelope which bounds decaying sensitivity, one can achieve faster convergence to an excess empirical risk of $\tilde O(1/\epsilon n)$, where $\epsilon$ is the privacy budget and $n$ the number of samples.

Adaptive Federated Optimization

5 code implementations • ICLR 2021 • Sashank Reddi, Zachary Charles, Manzil Zaheer, Zachary Garrett, Keith Rush, Jakub Konečný, Sanjiv Kumar, H. Brendan McMahan

Federated learning is a distributed machine learning paradigm in which a large number of clients coordinate with a central server to learn a model without sharing their own training data.

Federated Learning
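The server-side recipe can be sketched as follows (a simplified FedAdagrad-style variant from the paper's family of methods; the client objectives and hyperparameters below are toy assumptions): the server treats the negative of the averaged client update as a pseudo-gradient and applies an adaptive step to it.

```python
import math

def client_update(w, target, lr=0.05, steps=3):
    # Local gradient steps on the toy client loss 0.5 * (w - target)**2.
    for _ in range(steps):
        w = w - lr * (w - target)
    return w

def server_adagrad_step(w, pseudo_grad, v, lr=0.5, eps=1e-6):
    v = v + pseudo_grad ** 2                      # accumulate squared pseudo-gradients
    return w - lr * pseudo_grad / (math.sqrt(v) + eps), v

targets = [1.0, 3.0, 5.0]                         # each client's local optimum
w, v = 0.0, 0.0
for _ in range(200):
    deltas = [client_update(w, t) - w for t in targets]
    pseudo_grad = -sum(deltas) / len(deltas)      # server-side pseudo-gradient
    w, v = server_adagrad_step(w, pseudo_grad, v)
# w converges toward the mean of the client optima (3.0)
```

Swapping `server_adagrad_step` for an Adam- or Yogi-style update gives the other members of the family; the client loop is unchanged.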

Improving Federated Learning Personalization via Model Agnostic Meta Learning

2 code implementations • 27 Sep 2019 • Yihan Jiang, Jakub Konečný, Keith Rush, Sreeram Kannan

We present FL as a natural source of practical applications for MAML algorithms, and make the following observations.

Federated Learning • Meta-Learning
