Personalized Federated Learning using Hypernetworks

8 Mar 2021  ·  Aviv Shamsian, Aviv Navon, Ethan Fetaya, Gal Chechik

Personalized federated learning is the task of training machine learning models for multiple clients, each with its own data distribution. The goal is to train personalized models collaboratively while accounting for data disparities across clients and reducing communication costs. We propose a novel approach to this problem using hypernetworks, termed pFedHN for personalized Federated HyperNetworks. In this approach, a central hypernetwork model is trained to generate a set of models, one for each client. This architecture provides effective parameter sharing across clients while maintaining the capacity to generate unique and diverse personal models. Furthermore, since the hypernetwork's parameters are never transmitted, this approach decouples the communication cost from the size of the trainable model. We evaluate pFedHN empirically on several personalized federated learning challenges and find that it outperforms previous methods. Finally, since hypernetworks share information across clients, we show that pFedHN generalizes better to new clients whose distributions differ from any client observed during training.
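To make the core idea concrete, here is a minimal sketch of a hypernetwork that maps a learned per-client embedding to the parameters of a small per-client model. This is an illustration of the general mechanism described in the abstract, not the authors' implementation: the dimensions, the linear hypernetwork, and all names (`W_h`, `client_embeddings`, `generate_client_model`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target client model: a tiny linear classifier with D inputs, C classes.
D, C = 8, 3
n_target_params = D * C + C  # flattened weights + biases

# Hypernetwork (shared across clients): here just a linear map from a
# learned per-client embedding to the target model's parameter vector.
EMB = 4
W_h = rng.normal(scale=0.1, size=(EMB, n_target_params))
client_embeddings = rng.normal(size=(5, EMB))  # one embedding per client

def generate_client_model(client_id):
    """Hypernetwork forward pass: embedding -> personalized parameters.

    Only this generated parameter vector would be sent to the client;
    the hypernetwork weights W_h stay on the server, so communication
    cost is independent of the hypernetwork's size.
    """
    theta = client_embeddings[client_id] @ W_h
    W = theta[: D * C].reshape(D, C)
    b = theta[D * C:]
    return W, b

def client_forward(client_id, x):
    """Run a client's personalized model on a batch of inputs."""
    W, b = generate_client_model(client_id)
    return x @ W + b

x = rng.normal(size=(2, D))
logits = client_forward(0, x)
print(logits.shape)  # (2, C) logits for client 0
```

In training, each client would compute a loss with its generated parameters on local data and send gradients back, which the server uses to update both the shared hypernetwork and that client's embedding; distinct embeddings yield distinct personal models.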

Task                             Dataset    Model       Metric            Value   Global Rank
Personalized Federated Learning  CIFAR-10   pFedHN      ACC@1-10Clients   90.83   # 2
Personalized Federated Learning  CIFAR-10   pFedHN      ACC@1-50Clients   88.38   # 1
Personalized Federated Learning  CIFAR-10   pFedHN      ACC@1-100Clients  87.97   # 4
Personalized Federated Learning  CIFAR-10   pFedHN-PC   ACC@1-10Clients   92.47   # 1
Personalized Federated Learning  CIFAR-10   pFedHN-PC   ACC@1-50Clients   90.08   # 5
Personalized Federated Learning  CIFAR-10   pFedHN-PC   ACC@1-100Clients  88.09   # 3
Personalized Federated Learning  CIFAR-10   pFedHN-PC   ACC@1-500         83.2    # 4
Personalized Federated Learning  CIFAR-100  pFedHN      ACC@1-10Clients   65.74   # 2
Personalized Federated Learning  CIFAR-100  pFedHN      ACC@1-50Clients   59.46   # 5
Personalized Federated Learning  CIFAR-100  pFedHN      ACC@1-100Clients  53.24   # 4
Personalized Federated Learning  CIFAR-100  pFedHN-PC   ACC@1-10Clients   68.15   # 1
Personalized Federated Learning  CIFAR-100  pFedHN-PC   ACC@1-50Clients   60.17   # 4
Personalized Federated Learning  CIFAR-100  pFedHN-PC   ACC@1-100Clients  52.40   # 5
Personalized Federated Learning  CIFAR-100  pFedHN-PC   ACC@1-500         34.1    # 4
Personalized Federated Learning  Omniglot   pFedHN      ACC@1-50Clients   72.03   # 2
Personalized Federated Learning  Omniglot   pFedHN-PC   ACC@1-50Clients   81.89   # 1