no code implementations • 11 Dec 2022 • Jiarong Yang, Yuan Liu, Rahif Kassab
Distributed Stein Variational Gradient Descent (DSVGD) is a non-parametric distributed learning framework for federated Bayesian learning, in which multiple clients jointly train a machine learning model by iteratively exchanging a set of deterministic, interacting particles with the server.
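For intuition, here is a minimal sketch of the Stein Variational Gradient Descent update that such particle-based schemes build on, assuming a generic score function grad_log_p and an RBF kernel; the particle count, step size, and bandwidth below are illustrative choices, not values from the paper.

```python
import numpy as np

def rbf_kernel(x, y, h=1.0):
    """RBF kernel k(x, y) and its gradient with respect to x."""
    diff = x - y
    k = np.exp(-np.dot(diff, diff) / (2.0 * h ** 2))
    return k, -diff / h ** 2 * k

def svgd_step(particles, grad_log_p, step=0.1, h=1.0):
    """One SVGD update of an (n, d) array of interacting particles.

    grad_log_p maps a d-vector to the gradient of the log target
    density (the score) at that point.
    """
    n = particles.shape[0]
    phi = np.zeros_like(particles)
    for i in range(n):
        for j in range(n):
            k, grad_k = rbf_kernel(particles[j], particles[i], h)
            # Kernel-weighted score term attracts particles toward high
            # density; the kernel-gradient term repels them, preserving
            # diversity across the particle set.
            phi[i] += k * grad_log_p(particles[j]) + grad_k
    return particles + step * phi / n

# Toy usage: 50 particles approximating a standard 2-D Gaussian,
# whose score function is simply -x.
rng = np.random.default_rng(0)
particles = rng.normal(size=(50, 2)) * 3.0
for _ in range(200):
    particles = svgd_step(particles, grad_log_p=lambda x: -x)
```

The interaction through the kernel is what makes the particles "non-random and interacting": each particle's update depends on all the others, rather than on independent sampling.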
no code implementations • 23 Nov 2021 • Jinu Gong, Osvaldo Simeone, Rahif Kassab, Joonhyuk Kang
Particle-based variational Bayesian learning methods have the advantage of not being limited by the bias that affects more conventional parametric techniques.
1 code implementation • 11 Sep 2020 • Rahif Kassab, Osvaldo Simeone
This paper introduces Distributed Stein Variational Gradient Descent (DSVGD), a non-parametric generalized Bayesian inference framework for federated learning.
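To make the client-server pattern concrete, below is a heavily simplified, serial sketch of one communication round, reusing the svgd_step helper from the sketch above; the client score functions are hypothetical placeholders, and the actual DSVGD scheme additionally maintains per-client approximate likelihood factors and client scheduling, which are omitted here.

```python
import numpy as np  # assumes svgd_step from the sketch above is in scope

def federated_round(global_particles, client_scores, local_steps=5,
                    step=0.1, h=1.0):
    """One serial communication round, heavily simplified.

    Each client in turn downloads the current global particles,
    refines them with local SVGD steps driven by its own local score
    function (a hypothetical stand-in for its local data), and uploads
    the result back to the server.
    """
    particles = global_particles
    for grad_log_p in client_scores:
        for _ in range(local_steps):
            particles = svgd_step(particles, grad_log_p, step=step, h=h)
    return particles
```

Because the state exchanged between server and clients is the particle set itself, the size of the communicated model scales with the number of particles rather than with a fixed parametric form.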