Search Results for author: Franklyn Wang

Found 5 papers, 2 papers with code

Intrinsic Gradient Compression for Scalable and Efficient Federated Learning

no code implementations • FL4NLP (ACL) 2022 • Luke Melas-Kyriazi, Franklyn Wang

Federated learning is a rapidly growing area of research, holding the promise of privacy-preserving distributed training on edge devices.

Federated Learning • Learning Theory • +1

Intrinsic Gradient Compression for Federated Learning

no code implementations • 5 Dec 2021 • Luke Melas-Kyriazi, Franklyn Wang

Federated learning is a rapidly growing area of research that enables a large number of clients to jointly train a machine learning model on privately held data.

BIG-bench Machine Learning • Federated Learning • +1
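The core idea behind this line of work is to communicate gradients through a low-dimensional subspace rather than at full dimension. Below is a minimal sketch of one such subspace-based compression scheme; the shared random projection, the dimensions, and the learning rate are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Hypothetical illustration (not the authors' code): clients and server
# share a fixed random projection P mapping a low-dimensional "intrinsic"
# space R^k into the full parameter space R^d, so each round only k
# numbers per client travel over the network instead of d.
rng = np.random.default_rng(0)
d, k, n_clients = 10_000, 64, 8

P = rng.standard_normal((d, k)) / np.sqrt(k)  # shareable via a common seed
w = np.zeros(d)                               # global model parameters

for rnd in range(3):
    compressed = []
    for c in range(n_clients):
        g = rng.standard_normal(d) * 0.1      # stand-in for a local gradient
        z = P.T @ g                           # compress: d floats -> k floats
        compressed.append(z)
    z_avg = np.mean(compressed, axis=0)       # server averages k-dim updates
    w -= 0.1 * (P @ z_avg)                    # decompress and apply
```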

Recommending with Recommendations

no code implementations • 2 Dec 2021 • Naveen Durvasula, Franklyn Wang, Scott Duke Kominers

In our setting, the user's (potentially sensitive) information belongs to a high-dimensional latent space, and the ideal recommendations for the source and target tasks (which are non-sensitive) are given by unknown linear transformations of the user information.

Recommendation Systems
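A toy simulation of the stated setup may help: latent user vectors live in a high-dimensional space, the ideal source- and target-task recommendations are unknown linear maps of those vectors, and a target recommender is fit from the source recommendations alone. All shapes and the least-squares fit below are our own illustrative assumptions, not the paper's method.

```python
import numpy as np

# Toy model of the abstract's setting: sensitive latent user info X in R^d,
# with ideal source/target recommendations given by unknown linear maps.
rng = np.random.default_rng(1)
d, p, q, n = 50, 10, 8, 5_000

A = rng.standard_normal((p, d))   # unknown source-task map
B = rng.standard_normal((q, d))   # unknown target-task map
X = rng.standard_normal((n, d))   # sensitive latent user information

S = X @ A.T                       # observable source recommendations
T = X @ B.T                       # target recommendations we want to predict

# Learn target recs from source recs alone, never touching X directly.
# Since p < d, the fit is lossy; the residual measures what is recoverable.
W, *_ = np.linalg.lstsq(S, T, rcond=None)
print(np.linalg.norm(S @ W - T) / np.linalg.norm(T))
```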

SubseasonalClimateUSA: A Dataset for Subseasonal Forecasting and Benchmarking

2 code implementations • NeurIPS 2023 • Soukayna Mouatadid, Paulo Orenstein, Genevieve Flaspohler, Miruna Oprescu, Judah Cohen, Franklyn Wang, Sean Knight, Maria Geogdzhayeva, Sam Levang, Ernest Fraenkel, Lester Mackey

To streamline this process and accelerate future development, we introduce SubseasonalClimateUSA, a curated dataset for training and benchmarking subseasonal forecasting models in the United States.

Benchmarking
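As a rough picture of what benchmarking against such a dataset involves, here is a hypothetical pandas sketch that scores a simple climatology baseline. The file name and column schema are assumptions made for illustration, not the dataset's documented format.

```python
import pandas as pd

# Hypothetical benchmarking example. The file "subseasonal_tmp2m.csv" and
# its columns ("lat", "lon", "start_date", "tmp2m") are assumed for
# illustration only.
df = pd.read_csv("subseasonal_tmp2m.csv", parse_dates=["start_date"])

# Climatology baseline: per-gridpoint mean temperature over training years.
train = df[df.start_date.dt.year < 2018]
test = df[df.start_date.dt.year >= 2018]
clim = train.groupby(["lat", "lon"])["tmp2m"].mean().rename("clim")

# Score the baseline on held-out years with RMSE.
test = test.join(clim, on=["lat", "lon"])
rmse = ((test.tmp2m - test.clim) ** 2).mean() ** 0.5
print(f"climatology RMSE: {rmse:.2f}")
```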

Generalization by Recognizing Confusion

1 code implementation • 13 Jun 2020 • Daniel Chiu, Franklyn Wang, Scott Duke Kominers

A recently proposed technique called self-adaptive training augments modern neural networks by allowing them to adjust training labels on the fly, to avoid overfitting to samples that may be mislabeled or otherwise non-representative.
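The label-adjustment mechanism of self-adaptive training (Huang et al., 2020) keeps a soft target per example and blends it with the model's current predictions, so confidently mispredicted examples gradually stop being forced onto their (possibly noisy) labels. A minimal PyTorch sketch follows; the momentum value and update schedule are illustrative choices, not the exact recipe from this paper.

```python
import torch
import torch.nn.functional as F

# Sketch of self-adaptive training's label update: stored soft targets are
# an exponential moving average of the model's predicted probabilities.
def update_targets(soft_targets, logits, idx, alpha=0.9):
    """Blend stored soft labels with the model's current predictions."""
    with torch.no_grad():
        probs = F.softmax(logits, dim=1)
        soft_targets[idx] = alpha * soft_targets[idx] + (1 - alpha) * probs
    return soft_targets

# Usage inside a training step: soft_targets starts as one-hot labels and
# drifts toward the model's own predictions over the course of training.
n, c = 8, 10
soft_targets = F.one_hot(torch.randint(c, (n,)), c).float()
logits = torch.randn(n, c, requires_grad=True)
soft_targets = update_targets(soft_targets, logits, torch.arange(n))

# Cross-entropy against the adjusted soft targets instead of hard labels.
loss = -(soft_targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
loss.backward()
```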
