Search Results for author: Vincent Plassier

Found 7 papers, 0 papers with code

Conformal Prediction for Federated Uncertainty Quantification Under Label Shift

no code implementations • 8 Jun 2023 • Vincent Plassier, Mehdi Makni, Aleksandr Rubashevskii, Eric Moulines, Maxim Panov

Federated Learning (FL) is a machine learning framework where many clients collaboratively train models while keeping the training data decentralized.

Conformal Prediction • Federated Learning • +2
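
To fix ideas about the conformal prediction ingredient, here is a minimal single-machine split conformal sketch on synthetic data. It is not the federated, label-shift procedure of the paper; every name and number below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2x + noise (purely synthetic, for illustration).
x = rng.uniform(-1, 1, size=500)
y = 2.0 * x + rng.normal(scale=0.3, size=500)

# Split into a proper training set and a calibration set.
x_train, y_train = x[:300], y[:300]
x_cal, y_cal = x[300:], y[300:]

# Fit any point predictor on the training split (here: a least-squares slope).
slope = np.sum(x_train * y_train) / np.sum(x_train ** 2)
predict = lambda t: slope * t

# Nonconformity scores on the calibration split (absolute residuals).
scores = np.abs(y_cal - predict(x_cal))

# Calibrated quantile giving roughly 90% marginal coverage.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new input: [f(x) - q, f(x) + q].
x_new = 0.25
print(predict(x_new) - q, predict(x_new) + q)
```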

Membership Inference Attacks via Adversarial Examples

no code implementations • 27 Jul 2022 • Hamid Jalalzai, Elie Kadoche, Rémi Leluc, Vincent Plassier

In this paper, we develop a means to measure the leakage of training data, leveraging a quantity that serves as a proxy for the total variation of a trained model near its training samples.
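
For context on what "leakage of training data" means operationally, here is a generic loss-threshold membership inference baseline on simulated losses. This is not the adversarial-example proxy developed in the paper; the loss distributions and threshold are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-example losses of a trained model: members (training
# points) tend to have lower loss than non-members (held-out points).
member_losses = rng.exponential(scale=0.2, size=1000)
nonmember_losses = rng.exponential(scale=0.6, size=1000)

# Classic loss-threshold attack: guess "member" when the loss falls below tau.
tau = 0.3
losses = np.concatenate([member_losses, nonmember_losses])
is_member = np.concatenate([np.ones(1000), np.zeros(1000)])
guess = (losses < tau).astype(float)

# Attack accuracy well above 0.5 indicates measurable training-data leakage.
print("attack accuracy:", np.mean(guess == is_member))
```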

DG-LMC: A Turn-key and Scalable Synchronous Distributed MCMC Algorithm via Langevin Monte Carlo within Gibbs

no code implementations • 11 Jun 2021 • Vincent Plassier, Maxime Vono, Alain Durmus, Eric Moulines

Performing reliable Bayesian inference on a big data scale is becoming a keystone in the modern era of machine learning.

Bayesian Inference
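
As background for the Langevin Monte Carlo building block, here is a minimal single-chain unadjusted Langevin sketch targeting a standard Gaussian. It only illustrates the basic update and omits the distributed Gibbs structure that DG-LMC adds; the step size and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Target: standard Gaussian in d dimensions, so grad log pi(theta) = -theta.
d = 5
grad_log_pi = lambda theta: -theta

# Unadjusted Langevin update:
# theta_{k+1} = theta_k + gamma * grad log pi(theta_k) + sqrt(2 * gamma) * xi_k
gamma = 0.05
theta = np.zeros(d)
samples = []
for _ in range(5000):
    theta = theta + gamma * grad_log_pi(theta) + np.sqrt(2 * gamma) * rng.normal(size=d)
    samples.append(theta.copy())

# After burn-in, the empirical marginal variances should be close to 1.
print(np.var(np.array(samples)[1000:], axis=0))
```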

QLSD: Quantised Langevin stochastic dynamics for Bayesian federated learning

no code implementations • 1 Jun 2021 • Maxime Vono, Vincent Plassier, Alain Durmus, Aymeric Dieuleveut, Eric Moulines

The objective of Federated Learning (FL) is to perform statistical inference for data which are decentralised and stored locally on networked clients.

Federated Learning
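
QLSD couples Langevin-type updates with compressed client-to-server communication. As a rough illustration of the compression side, below is a sketch of a standard unbiased stochastic quantizer of the kind used for gradient compression in federated learning; it is not the paper's exact operator, and the function name and level count are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def stochastic_quantize(v, levels=8):
    """Unbiased stochastic quantization of a vector onto `levels` uniform
    levels per coordinate, scaled by the vector norm (QSGD-style sketch)."""
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return v
    scaled = np.abs(v) / norm * levels           # in [0, levels]
    lower = np.floor(scaled)
    prob = scaled - lower                        # probability of rounding up
    rounded = lower + (rng.random(v.shape) < prob)
    return np.sign(v) * norm * rounded / levels  # E[output] = v

g = rng.normal(size=10)
q = np.mean([stochastic_quantize(g) for _ in range(2000)], axis=0)
print(np.max(np.abs(q - g)))  # small: the quantizer is unbiased on average
```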

Risk bounds when learning infinitely many response functions by ordinary linear regression

no code implementations • 16 Jun 2020 • Vincent Plassier, François Portier, Johan Segers

Consider the problem of learning a large number of response functions simultaneously based on the same input variables.

regression
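
The shared-design setting can be illustrated by fitting all responses in one least-squares solve, since the ordinary least-squares estimator factors through the same (X^T X)^{-1} X^T for every response. A hedged NumPy sketch with made-up dimensions, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(4)

# Same design matrix X for every response; many responses fitted at once.
n, d, m = 200, 10, 1000           # samples, input dimension, number of responses
X = rng.normal(size=(n, d))
B_true = rng.normal(size=(d, m))  # one coefficient column per response function
Y = X @ B_true + 0.1 * rng.normal(size=(n, m))

# Ordinary least squares for all m responses in a single call.
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

print(B_hat.shape)                     # (d, m)
print(np.max(np.abs(B_hat - B_true)))  # small estimation error
```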
