Personalized Federated Learning

19 papers with code • 3 benchmarks • 2 datasets

The federated learning setup presents numerous challenges, including data heterogeneity (differences in data distribution), device heterogeneity (in computation capabilities, network connection, etc.), and communication efficiency. Data heterogeneity in particular makes it hard to learn a single shared global model that performs well for all clients. To overcome these issues, Personalized Federated Learning (PFL) aims to personalize the global model for each client in the federation.
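
As a concrete illustration of the difference between a single shared model and a personalized one, here is a minimal sketch in plain Python/NumPy. It contrasts FedAvg-style averaging of a global model with the simplest form of personalization, where each client fine-tunes the aggregated model on its own data. The client data and hyperparameters are hypothetical, and this is a generic illustration rather than any specific paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical non-IID clients: each has its own linear relation y = w_k * x + noise.
clients = [{"x": rng.normal(size=50), "w_true": w} for w in (1.0, 2.5, -0.5)]
for c in clients:
    c["y"] = c["w_true"] * c["x"] + 0.1 * rng.normal(size=50)

def local_sgd(w, x, y, lr=0.1, steps=20):
    """Plain gradient descent on a 1-D least-squares objective."""
    for _ in range(steps):
        grad = np.mean((w * x - y) * x)
        w -= lr * grad
    return w

# FedAvg-style training of a single shared global model.
w_global = 0.0
for _ in range(10):                       # communication rounds
    local_models = [local_sgd(w_global, c["x"], c["y"]) for c in clients]
    w_global = np.mean(local_models)      # server aggregation

# Simplest personalization: each client fine-tunes the global model locally.
personalized = [local_sgd(w_global, c["x"], c["y"], steps=5) for c in clients]
print("global:", round(w_global, 3), "personalized:", [round(w, 3) for w in personalized])
```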

Libraries

Use these libraries to find Personalized Federated Learning models and implementations

Most implemented papers

Adaptive Personalized Federated Learning

MLOPTPSU/FedTorch 30 Mar 2020

Investigation of the degree of personalization in federated learning algorithms has shown that maximizing the performance of the global model alone confines the capacity of the local models to personalize.
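
One way to read the adaptive-personalization idea is as an interpolation between a purely global and a purely local model, with a per-client mixing weight. The sketch below illustrates that mixing for a linear model; it is a hedged illustration with invented parameters, not the authors' exact update rules (in particular, the adaptive tuning of the mixing weight is omitted).

```python
import numpy as np

def personalized_prediction(x, w_global, w_local, alpha):
    """Convex mixture of a local and a global linear model.
    alpha is the per-client mixing weight; alpha=0 recovers the pure global model,
    alpha=1 the purely local one."""
    return alpha * (w_local @ x) + (1.0 - alpha) * (w_global @ x)

w_global = np.array([1.0, 2.0])   # hypothetical server model
w_local = np.array([0.5, 3.0])    # hypothetical client-specific model
x = np.array([1.0, -1.0])
print(personalized_prediction(x, w_global, w_local, alpha=0.25))
```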

Personalized Federated Learning with Moreau Envelopes

CharlieDinh/pFedMe NeurIPS 2020

Federated learning (FL) is a decentralized and privacy-preserving machine learning technique in which a group of clients collaborate with a server to learn a global model without sharing clients' data.
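
The Moreau-envelope formulation referenced in the title regularizes each client's personalized model toward the global model through a proximal term. Below is a minimal sketch of such a proximal personalization step, assuming a toy quadratic client loss; the solver, hyperparameters, and loss are illustrative assumptions, not the pFedMe implementation.

```python
import numpy as np

def proximal_personalization(w_global, grad_fi, lam=1.0, lr=0.05, steps=50):
    """Approximately solve  min_theta  f_i(theta) + lam/2 * ||theta - w_global||^2
    by gradient descent, keeping the personalized model near the global one."""
    theta = w_global.copy()
    for _ in range(steps):
        theta -= lr * (grad_fi(theta) + lam * (theta - w_global))
    return theta

# Toy client objective f_i(theta) = 0.5 * ||theta - target||^2 (hypothetical data).
target = np.array([2.0, -1.0])
grad_fi = lambda theta: theta - target

w_global = np.zeros(2)
theta_i = proximal_personalization(w_global, grad_fi, lam=1.0)
print(theta_i)  # lands between the global model and the client's own optimum
```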

Personalized Federated Learning with Theoretical Guarantees: A Model-Agnostic Meta-Learning Approach

TsingZ0/PFL-Non-IID NeurIPS 2020

In this paper, we study a personalized variant of federated learning in which our goal is to find an initial shared model that current or new users can easily adapt to their local dataset by performing one or a few steps of gradient descent on their own data.
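
The snippet above describes a meta-learning view: train an initialization whose quality is measured after a few local gradient steps. The sketch below uses a first-order approximation of that idea on toy quadratic client losses; the losses, step sizes, and the first-order simplification are assumptions made for illustration, not the paper's exact algorithm.

```python
import numpy as np

def adapt(w, grad_fk, lr_inner=0.1, steps=1):
    """One (or a few) local gradient steps from the shared initialization."""
    theta = w.copy()
    for _ in range(steps):
        theta -= lr_inner * grad_fk(theta)
    return theta

# Hypothetical quadratic client losses f_k(w) = 0.5 * ||w - c_k||^2.
centers = [np.array([1.0, 0.0]), np.array([-1.0, 2.0]), np.array([0.0, -1.0])]
grads = [lambda w, c=c: w - c for c in centers]

# Meta-training loop (first-order approximation of MAML-style updates):
w = np.zeros(2)
for _ in range(200):
    meta_grad = np.zeros(2)
    for g in grads:
        theta_k = adapt(w, g)          # inner adaptation on the client
        meta_grad += g(theta_k)        # gradient of the post-adaptation loss
    w -= 0.05 * meta_grad / len(grads)

print("meta-initialization:", w.round(3))
print("after one local step per client:", [adapt(w, g).round(3) for g in grads])
```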

Ditto: Fair and Robust Federated Learning Through Personalization

litian96/ditto 8 Dec 2020

Fairness and robustness are two important concerns for federated learning systems.

Exploiting Shared Representations for Personalized Federated Learning

lgcollins/FedRep 14 Feb 2021

Based on this intuition, we propose a novel federated learning framework and algorithm for learning a shared data representation across clients and unique local heads for each client.
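
The split described above, a representation shared across clients and a head kept local, can be sketched directly in PyTorch. The layer sizes, module names, and aggregation loop below are illustrative assumptions, not taken from the FedRep repository.

```python
import torch
import torch.nn as nn

class ClientModel(nn.Module):
    def __init__(self, in_dim: int = 32, rep_dim: int = 64, num_classes: int = 10):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, rep_dim), nn.ReLU())  # shared representation
        self.head = nn.Linear(rep_dim, num_classes)                       # personalized local head

    def forward(self, x):
        return self.head(self.body(x))

clients = [ClientModel() for _ in range(3)]

def average_bodies(models):
    """Server step: average only the representation parameters; heads are never aggregated."""
    avg = {k: torch.zeros_like(v) for k, v in models[0].body.state_dict().items()}
    for m in models:
        for k, v in m.body.state_dict().items():
            avg[k] = avg[k] + v / len(models)
    for m in models:
        m.body.load_state_dict(avg)

average_bodies(clients)
print(clients[0](torch.randn(4, 32)).shape)  # torch.Size([4, 10])
```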

A New Look and Convergence Rate of Federated Multi-Task Learning with Laplacian Regularization

dual-grp/fedu_fmtl 14 Feb 2021

Non-Independent and Identically Distributed (non-IID) data distribution among clients is considered the key factor that degrades the performance of federated learning (FL).

Intrusion Detection with Segmented Federated Learning for Large-Scale Multiple LANs

yuweisunn/segmented-FL International Joint Conference on Neural Networks (IJCNN) 2020

In this research, segmented federated learning is proposed. Unlike the collaborative learning based on a single global model in traditional federated learning, it keeps multiple global models, allowing each segment of participants to conduct collaborative learning separately, and it dynamically rearranges the segmentation of participants.
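
The core server-side change described above is that aggregation happens per segment rather than over all participants. A minimal sketch of such within-segment averaging follows; the parameter vectors and segment assignments are hypothetical, and this is only an illustration of the idea, not the paper's implementation.

```python
import numpy as np

def aggregate_within_segments(local_models, segments):
    """local_models: list of parameter vectors, one per participant.
    segments: segment id per participant. Returns {segment_id: averaged model}."""
    global_models = {}
    for seg in set(segments):
        members = [m for m, s in zip(local_models, segments) if s == seg]
        global_models[seg] = np.mean(members, axis=0)
    return global_models

local_models = [np.array([1.0, 0.0]), np.array([1.2, 0.1]), np.array([-2.0, 5.0])]
segments = [0, 0, 1]   # participants 0 and 1 share a segment; participant 2 is separate
print(aggregate_within_segments(local_models, segments))
```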

Personalized Federated Learning with First Order Model Optimization

TsingZ0/PFL-Non-IID ICLR 2021

While federated learning traditionally aims to train a single global model across decentralized local datasets, one model may not always be ideal for all participating clients.

Adaptive Intrusion Detection in the Networking of Large-Scale LANs with Segmented Federated Learning

yuweisunn/segmented-FL IEEE Open Journal of the Communications Society (Conference version: IJCNN) 2020

We propose Segmented-Federated Learning (Segmented-FL), where by employing periodic local model evaluation and network segmentation, we aim to bring similar network environments to the same group.
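
The re-segmentation step mentioned above groups participants whose periodic local evaluations look similar. The sketch below illustrates one such criterion, binning evaluation scores; the binning rule and the scores themselves are assumptions for illustration, not the grouping rule used in Segmented-FL.

```python
def resegment(eval_scores, bin_width=0.1):
    """Assign participants with similar local evaluation scores to the same segment."""
    return [int(score // bin_width) for score in eval_scores]

eval_scores = [0.91, 0.93, 0.55, 0.58]   # hypothetical per-participant accuracies
print(resegment(eval_scores))            # -> [9, 9, 5, 5]: two segments
```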

PFL-MoE: Personalized Federated Learning Based on Mixture of Experts

guobbin/PFL-MoE 31 Dec 2020

To achieve model personalization while maintaining generalization, in this paper, we propose a new approach, named PFL-MoE, which mixes outputs of the personalized model and global model via the MoE architecture.
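
Mixing the outputs of a global and a personalized model can be sketched with a small gating network that produces a per-example weight. The sigmoid gate, layer sizes, and module names below are illustrative assumptions, not the exact PFL-MoE architecture.

```python
import torch
import torch.nn as nn

class MixtureOfTwo(nn.Module):
    def __init__(self, global_model: nn.Module, personal_model: nn.Module, in_dim: int):
        super().__init__()
        self.global_model = global_model      # shared across clients
        self.personal_model = personal_model  # trained on this client's data
        self.gate = nn.Sequential(nn.Linear(in_dim, 1), nn.Sigmoid())  # per-example weight

    def forward(self, x):
        g = self.gate(x)                      # in (0, 1), one weight per example
        return g * self.personal_model(x) + (1 - g) * self.global_model(x)

in_dim, num_classes = 16, 10
model = MixtureOfTwo(nn.Linear(in_dim, num_classes), nn.Linear(in_dim, num_classes), in_dim)
logits = model(torch.randn(4, in_dim))
print(logits.shape)  # torch.Size([4, 10])
```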