Search Results for author: Pierre Tholoniat

Found 4 papers, 2 papers with code

Differentially Private Training of Mixture of Experts Models

no code implementations • 11 Feb 2024 • Pierre Tholoniat, Huseyin A. Inan, Janardhan Kulkarni, Robert Sim

This position paper investigates the integration of Differential Privacy (DP) in the training of Mixture of Experts (MoE) models within the field of natural language processing.

Computational Efficiency • Privacy Preserving
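
As a rough illustration of what differentially private training of such a model involves, the sketch below applies DP-SGD-style per-example gradient clipping and Gaussian noise to a toy Mixture-of-Experts layer in PyTorch. The layer sizes, hyperparameters, and training loop are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: DP-SGD-style training step for a toy Mixture-of-Experts layer.
# All module names, sizes, and hyperparameters are illustrative, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    """A minimal dense-gated MoE layer: a softmax gate over a few expert MLPs."""
    def __init__(self, d_in=16, d_hidden=32, n_experts=4, n_classes=2):
        super().__init__()
        self.gate = nn.Linear(d_in, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU(), nn.Linear(d_hidden, n_classes))
            for _ in range(n_experts)
        )

    def forward(self, x):
        weights = F.softmax(self.gate(x), dim=-1)              # (B, E)
        outs = torch.stack([e(x) for e in self.experts], -1)   # (B, C, E)
        return torch.einsum("bce,be->bc", outs, weights)

def dp_sgd_step(model, batch_x, batch_y, optimizer, clip_norm=1.0, noise_multiplier=1.0):
    """One DP-SGD step: per-example gradient clipping plus Gaussian noise."""
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]
    for x, y in zip(batch_x, batch_y):                         # microbatches of size 1
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x.unsqueeze(0)), y.unsqueeze(0))
        loss.backward()
        # Clip this example's gradient to norm <= clip_norm, then accumulate.
        total_norm = torch.sqrt(sum(p.grad.norm() ** 2 for p in params))
        scale = (clip_norm / (total_norm + 1e-6)).clamp(max=1.0)
        for s, p in zip(summed, params):
            s += p.grad * scale
    batch_size = len(batch_x)
    for p, s in zip(params, summed):
        noise = torch.randn_like(s) * noise_multiplier * clip_norm
        p.grad = (s + noise) / batch_size
    optimizer.step()

model = TinyMoE()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 16), torch.randint(0, 2, (8,))
dp_sgd_step(model, x, y, opt)
```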

Packing Privacy Budget Efficiently

no code implementations • 26 Dec 2022 • Pierre Tholoniat, Kelly Kostopoulou, Mosharaf Chowdhury, Asaf Cidon, Roxana Geambasu, Mathias Lécuyer, Junfeng Yang

This DP budget can be regarded as a new type of compute resource in workloads where multiple ML models are trained on user data.

Fairness • Scheduling
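
The idea of treating DP budget as a finite, schedulable resource can be illustrated with a small admission check: a training task is granted only if every data block it touches still has enough remaining epsilon. The block names, budgets, and greedy policy below are illustrative assumptions, not the paper's scheduling algorithm.

```python
# Hedged sketch: per-data-block DP budget (epsilon) as a finite resource that a
# scheduler allocates to training tasks. Names and policy are illustrative only.
from dataclasses import dataclass

@dataclass
class DataBlock:
    name: str
    epsilon_total: float
    epsilon_used: float = 0.0

    def remaining(self):
        return self.epsilon_total - self.epsilon_used

@dataclass
class Task:
    name: str
    demand: dict  # block name -> epsilon requested

class BudgetScheduler:
    """Admit a task only if every block it touches has enough remaining budget."""
    def __init__(self, blocks):
        self.blocks = {b.name: b for b in blocks}

    def try_schedule(self, task):
        if all(self.blocks[b].remaining() >= eps for b, eps in task.demand.items()):
            for b, eps in task.demand.items():
                self.blocks[b].epsilon_used += eps
            return True
        return False

blocks = [DataBlock("day-1", epsilon_total=3.0), DataBlock("day-2", epsilon_total=3.0)]
sched = BudgetScheduler(blocks)
print(sched.try_schedule(Task("train-A", {"day-1": 1.0, "day-2": 1.0})))  # True
print(sched.try_schedule(Task("train-B", {"day-1": 2.5})))                # False: only 2.0 left
```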

Privacy Budget Scheduling

1 code implementation • 29 Jun 2021 • Tao Luo, Mingen Pan, Pierre Tholoniat, Asaf Cidon, Roxana Geambasu, Mathias Lécuyer

We describe PrivateKube, an extension to the popular Kubernetes datacenter orchestrator that adds privacy as a new type of resource to be managed alongside other traditional compute resources, such as CPU, GPU, and memory.

Fairness • Scheduling
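
To convey what "privacy as a resource alongside CPU, GPU, and memory" might look like, the sketch below writes a Kubernetes-style pod spec as a Python dict that requests a privacy budget next to its compute requests. The extended-resource name and container image are hypothetical; PrivateKube's actual custom resources and their semantics are defined in the paper and its implementation, not reproduced here.

```python
# Hedged sketch: privacy budget requested like any other Kubernetes resource.
# "example.com/dp-epsilon" and the image name are hypothetical placeholders.
pipeline_pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "dp-training-job"},
    "spec": {
        "containers": [{
            "name": "trainer",
            "image": "registry.example.com/dp-trainer:latest",  # hypothetical image
            "resources": {
                "requests": {
                    "cpu": "4",
                    "memory": "8Gi",
                    # Privacy budget requested alongside traditional resources.
                    "example.com/dp-epsilon": "1",
                },
                "limits": {"example.com/dp-epsilon": "1"},
            },
        }],
    },
}
```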

ARIANN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing

2 code implementations • 8 Jun 2020 • Théo Ryffel, Pierre Tholoniat, David Pointcheval, Francis Bach

We evaluate our end-to-end system for private inference between distant servers on standard neural networks such as AlexNet, VGG16 or ResNet18, and for private training on smaller networks like LeNet.

Federated Learning • Privacy Preserving +1
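
For intuition about private inference between two servers, the sketch below shows plain two-party additive secret sharing: each server evaluates a public linear layer on its share, and the shares reconstruct to the true output without either server seeing the input. This only illustrates the general secret-sharing idea; the paper's actual contribution, function secret sharing for comparisons and other non-linear operations, is not reproduced here.

```python
# Hedged sketch: two-party additive secret sharing of an input vector, with a
# public toy linear layer evaluated locally on each share. Not the AriaNN protocol.
import numpy as np

P = 2**31 - 1  # toy modulus for the sharing arithmetic

def share(x, rng):
    """Split an integer vector into two additive shares modulo P."""
    r = rng.integers(0, P, size=x.shape)
    return r, (x - r) % P

def reconstruct(s0, s1):
    return (s0 + s1) % P

rng = np.random.default_rng(0)
x = np.array([3, 1, 4, 1, 5], dtype=np.int64)
W = np.array([[1, 0, 2, 0, 1],
              [0, 3, 0, 1, 0]], dtype=np.int64)  # public toy "layer"

x0, x1 = share(x, rng)      # each server holds one share of the input
y0 = (W @ x0) % P           # server 0 computes on its share only
y1 = (W @ x1) % P           # server 1 computes on its share only
assert np.array_equal(reconstruct(y0, y1), (W @ x) % P)
print(reconstruct(y0, y1))  # equals W @ x, yet neither server saw x
```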
