no code implementations • 5 Jul 2022 • Kaan Ozkara, Antonious M. Girgis, Deepesh Data, Suhas Diggavi
In this work, we begin with a generative framework that can unify several existing personalized FL algorithms as well as suggest new ones.
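A minimal sketch of one way such a generative framework can be instantiated, assuming a simple Gaussian model in which each client's parameter vector is drawn around a shared global mean; the helper name `local_map_estimate`, the variances, and the toy data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_map_estimate(y_local, mu_global, prior_var, noise_var):
    # MAP estimate under the assumed Gaussian model: a shrinkage (convex
    # combination) between the client's local sample mean and the global mean.
    n = y_local.shape[0]
    w = (n / noise_var) / (n / noise_var + 1.0 / prior_var)
    return w * y_local.mean(axis=0) + (1.0 - w) * mu_global

# Toy simulation: 5 clients whose true parameters are drawn around a global mean.
mu_true = np.array([1.0, -2.0, 0.5])
prior_var, noise_var = 0.25, 1.0
thetas = mu_true + np.sqrt(prior_var) * rng.standard_normal((5, 3))
data = [t + np.sqrt(noise_var) * rng.standard_normal((20, 3)) for t in thetas]

# Crude global estimate (average of local means), then personalized estimates.
mu_hat = np.mean([d.mean(axis=0) for d in data], axis=0)
personalized = [local_map_estimate(d, mu_hat, prior_var, noise_var) for d in data]
print(np.round(personalized[0], 3))
```

Under this assumption the personalized estimate interpolates between purely local and purely global training, which is the kind of algorithm such a framework can recover as a special case.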
no code implementations • NeurIPS 2021 • Kaan Ozkara, Navjot Singh, Deepesh Data, Suhas Diggavi
In this work, we introduce QuPeD, a quantized and personalized FL algorithm that facilitates collective training of personalized compressed models via knowledge distillation (KD) among clients with access to heterogeneous data and resources.
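As a rough illustration of KD-based collective training with quantized personal models, the sketch below combines a local cross-entropy loss with a distillation term toward a full-precision model while evaluating the personalized model through quantized weights; the uniform quantizer, the loss weights, and the helper names are assumptions for illustration, not QuPeD's exact alternating optimization.

```python
import torch
import torch.nn.functional as F

def quantize(w, bits=2):
    # Hypothetical symmetric uniform quantizer; not QuPeD's learned quantization centers.
    n = 2 ** (bits - 1)
    scale = w.abs().max().clamp(min=1e-8) / n
    return torch.clamp(torch.round(w / scale), -n, n - 1) * scale

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Local cross-entropy plus a distillation term toward the teacher's
    # softened predictions (standard KD-style objective).
    ce = F.cross_entropy(student_logits, labels)
    kl = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)
    return (1.0 - alpha) * ce + alpha * kl

# Toy usage: a personalized "student" evaluated through quantized weights,
# distilling from a full-precision "teacher" (e.g., a collaboratively trained model).
torch.manual_seed(0)
student, teacher = torch.nn.Linear(10, 4), torch.nn.Linear(10, 4)
x, y = torch.randn(8, 10), torch.randint(0, 4, (8,))

with torch.no_grad():
    teacher_logits = teacher(x)

w = student.weight
w_q = w + (quantize(w) - w).detach()          # straight-through estimator
student_logits = F.linear(x, w_q, student.bias)
loss = kd_loss(student_logits, teacher_logits, y)
loss.backward()                                # gradients reach the full-precision weights
print(float(loss))
```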
no code implementations • 23 Feb 2021 • Kaan Ozkara, Navjot Singh, Deepesh Data, Suhas Diggavi
When each client participating in the (federated) learning process has different requirements for the quantized model (both in value and in precision), we formulate a quantized personalization framework by adding to each local client objective a penalty term that penalizes deviation from a globally trained model, encouraging collaboration.
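A hedged sketch of such a penalized local objective, assuming a proximal-style squared-l2 penalty between the client's quantized weights and the globally trained model; the uniform quantizer, the penalty weight `lam`, and the bit-width handling are illustrative choices rather than the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def quantize(w, bits=2):
    # Hypothetical symmetric uniform quantizer; precision (bits) can differ per client.
    n = 2 ** (bits - 1)
    scale = w.abs().max().clamp(min=1e-8) / n
    return torch.clamp(torch.round(w / scale), -n, n - 1) * scale

def penalized_local_loss(model, x, y, w_global, bits=2, lam=0.1):
    # Task loss on the client's own data plus a proximal-style penalty pulling the
    # client's quantized weights toward the globally trained model's weights.
    w = model.weight
    w_q = w + (quantize(w, bits) - w).detach()   # straight-through estimator
    logits = F.linear(x, w_q, model.bias)
    task = F.cross_entropy(logits, y)
    penalty = 0.5 * lam * torch.sum((w_q - w_global) ** 2)
    return task + penalty

# Toy usage: one client with a 2-bit requirement, penalized toward a global model.
torch.manual_seed(0)
client = torch.nn.Linear(10, 4)
w_global = torch.randn(4, 10)
x, y = torch.randn(8, 10), torch.randint(0, 4, (8,))

loss = penalized_local_loss(client, x, y, w_global, bits=2, lam=0.1)
loss.backward()   # purely local update; collaboration enters only through the penalty term
print(float(loss))
```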