Search Results for author: Sean Augenstein

Found 7 papers, 3 papers with code

Learning from straggler clients in federated learning

no code implementations • 14 Mar 2024 • Andrew Hard, Antonious M. Girgis, Ehsan Amid, Sean Augenstein, Lara McConnaughey, Rajiv Mathews, Rohan Anil

How well do existing federated learning algorithms learn from client devices that return model updates with a significant time delay?

Federated Learning

Learning to Generate Image Embeddings with User-level Differential Privacy

1 code implementation • CVPR 2023 • Zheng Xu, Maxwell Collins, Yuxiao Wang, Liviu Panait, Sewoong Oh, Sean Augenstein, Ting Liu, Florian Schroff, H. Brendan McMahan

Small on-device models have been successfully trained with user-level differential privacy (DP) for next word prediction and image classification tasks in the past.

Federated Learning · Image Classification
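User-level DP training of the kind mentioned above typically bounds any single user's influence by clipping each user's model update to a fixed norm and adding Gaussian noise calibrated to that bound. The sketch below shows only the aggregation step; the parameter names (`clip_norm`, `noise_multiplier`) are illustrative, and the papers' full privacy accounting is not shown.

```python
import numpy as np

def dp_aggregate(user_updates, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Sketch of a user-level DP aggregator.

    Each user's update is clipped to `clip_norm` so no single user can
    dominate the sum; Gaussian noise scaled to the clip norm is then
    added before averaging.
    """
    rng = rng or np.random.default_rng()
    clipped = []
    for u in user_updates:
        norm = np.linalg.norm(u)
        scale = min(1.0, clip_norm / max(norm, 1e-12))  # shrink only if too large
        clipped.append(u * scale)
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(user_updates)
```

With `noise_multiplier=0` the function reduces to plain clipped averaging, which makes the clipping behavior easy to verify in isolation.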

Mixed Federated Learning: Joint Decentralized and Centralized Learning

no code implementations • 26 May 2022 • Sean Augenstein, Andrew Hard, Lin Ning, Karan Singhal, Satyen Kale, Kurt Partridge, Rajiv Mathews

For example, additional datacenter data can be leveraged to jointly learn from centralized (datacenter) and decentralized (federated) training data and better match an expected inference data distribution.

Federated Learning
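One simple way to combine centralized (datacenter) and decentralized (federated) training, sketched below on a toy linear model, is to mix a federated-averaged update with an update computed on datacenter data. The `mix` weight and this particular combination are illustrative assumptions for exposition, not the paper's exact algorithm.

```python
import numpy as np

def sgd_step(weights, X, y, lr=0.1):
    """One full-batch gradient step on a linear model with squared loss."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def mixed_round(weights, federated_clients, datacenter_data, mix=0.5):
    """One round of a toy 'mixed' update.

    Decentralized side: each federated client takes a local step and the
    results are averaged. Centralized side: a step on datacenter data.
    The two resulting models are blended with weight `mix` (assumed).
    """
    fed_models = [sgd_step(weights, X, y) for X, y in federated_clients]
    fed_model = np.mean(fed_models, axis=0)
    dc_model = sgd_step(weights, *datacenter_data)
    return mix * fed_model + (1 - mix) * dc_model
```

When the federated and datacenter data come from different distributions, the `mix` weight controls how closely the learned model tracks the expected inference distribution, which is the motivation the abstract describes.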

Jointly Learning from Decentralized (Federated) and Centralized Data to Mitigate Distribution Shift

no code implementations • 23 Nov 2021 • Sean Augenstein, Andrew Hard, Kurt Partridge, Rajiv Mathews

With privacy as a motivation, Federated Learning (FL) is an increasingly used paradigm where learning takes place collectively on edge devices, each with a cache of user-generated training examples that remain resident on the local device.

Federated Learning

Generative Models for Effective ML on Private, Decentralized Datasets

3 code implementations • ICLR 2020 • Sean Augenstein, H. Brendan McMahan, Daniel Ramage, Swaroop Ramaswamy, Peter Kairouz, Mingqing Chen, Rajiv Mathews, Blaise Aguera y Arcas

To improve real-world applications of machine learning, experienced modelers develop intuition about their datasets, their models, and how the two interact.

Federated Learning

Federated Learning for Mobile Keyboard Prediction

5 code implementations • 8 Nov 2018 • Andrew Hard, Kanishka Rao, Rajiv Mathews, Swaroop Ramaswamy, Françoise Beaufays, Sean Augenstein, Hubert Eichner, Chloé Kiddon, Daniel Ramage

We train a recurrent neural network language model using a distributed, on-device learning framework called federated learning for the purpose of next-word prediction in a virtual keyboard for smartphones.

Federated Learning · Language Modelling
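The distributed, on-device training described above follows the federated averaging pattern: each client trains locally on its own cached examples, and a server averages the resulting models without the raw data ever leaving the device. A toy sketch on a linear model (standing in for the recurrent language model, which is far larger):

```python
import numpy as np

def local_update(weights, client_data, lr=0.1):
    """One local SGD step on a linear model with squared loss.

    A stand-in for the on-device training step; the real system trains
    a recurrent neural network, not a linear model.
    """
    X, y = client_data
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_averaging(weights, clients, rounds=10):
    """Minimal FedAvg loop.

    Each round, every client trains locally on its own data, and the
    server averages the returned models weighted by local dataset size.
    """
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in clients:
            updates.append(local_update(weights, (X, y)))
            sizes.append(len(y))
        weights = np.average(updates, axis=0, weights=sizes)
    return weights
```

In production systems, each round samples only a subset of available devices and typically communicates model deltas rather than full weights; the loop above omits those details for brevity.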
