Search Results for author: Rahif Kassab

Found 3 papers, 1 paper with code

Client Selection for Federated Bayesian Learning

no code implementations · 11 Dec 2022 · Jiarong Yang, Yuan Liu, Rahif Kassab

Distributed Stein Variational Gradient Descent (DSVGD) is a non-parametric distributed learning framework for federated Bayesian learning, in which multiple clients jointly train a machine learning model by exchanging a set of non-random, interacting particles with the server.
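To make the particle-based mechanism concrete, here is a minimal single-node SVGD sketch (not the authors' DSVGD implementation, which additionally coordinates particle exchange between clients and a server): particles are nudged along a kernelized gradient of the target log-density, balancing attraction to high-probability regions against mutual repulsion. The 1D Gaussian target and all parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative target: N(mu, sigma^2) in 1D.
mu, sigma = 3.0, 1.0

def grad_log_p(x):
    # Gradient of log N(mu, sigma^2) at each particle.
    return -(x - mu) / sigma**2

def rbf_kernel(x, h=1.0):
    # Pairwise RBF kernel matrix k[i, j] = k(x_i, x_j)
    # and its gradient with respect to the first argument.
    diff = x[:, None] - x[None, :]
    k = np.exp(-diff**2 / (2 * h**2))
    grad_k = -diff / h**2 * k  # d k(x_i, x_j) / d x_i
    return k, grad_k

def svgd_step(x, step=0.1):
    # One Stein variational update: driving term (attraction to
    # high density) plus repulsive term (keeps particles diverse).
    n = len(x)
    k, grad_k = rbf_kernel(x)
    phi = (k @ grad_log_p(x) + grad_k.sum(axis=0)) / n
    return x + step * phi

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, size=20)
for _ in range(500):
    particles = svgd_step(particles)
# After many steps the particle mean drifts toward mu.
```

In the federated (DSVGD) setting, it is these deterministic, interacting particles, rather than gradients or model weights, that clients and server exchange each round.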

Forget-SVGD: Particle-Based Bayesian Federated Unlearning

no code implementations · 23 Nov 2021 · Jinu Gong, Osvaldo Simeone, Rahif Kassab, Joonhyuk Kang

Variational particle-based Bayesian learning methods have the advantage of not being limited by the bias affecting more conventional parametric techniques.

Bayesian Inference · Federated Learning

Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent

1 code implementation · 11 Sep 2020 · Rahif Kassab, Osvaldo Simeone

This paper introduces Distributed Stein Variational Gradient Descent (DSVGD), a non-parametric generalized Bayesian inference framework for federated learning.

Bayesian Inference · Federated Learning · +1
