1 code implementation • 12 Mar 2024 • Miguel Fuentes, Brett Mullins, Ryan McKenna, Gerome Miklau, Daniel Sheldon
This technique allows public data to be incorporated into a graphical-model-based mechanism.
1 code implementation • NeurIPS 2021 • Ryan McKenna, Siddhant Pradhan, Daniel Sheldon, Gerome Miklau
Private-PGM is a recent approach that uses graphical models to represent the data distribution. Its complexity is proportional to that of exact marginal inference in a graphical model whose structure is determined by the co-occurrence of variables in the noisy measurements.
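The structural point above can be illustrated with a minimal sketch (not the Private-PGM implementation itself): each noisy marginal measurement covers a clique of attributes, and the set of measured cliques induces the graphical-model structure that governs inference cost. All names and the Gaussian noise scale here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete dataset over three binary attributes 0, 1, 2 (hypothetical data).
data = rng.integers(0, 2, size=(1000, 3))

def noisy_marginal(data, cols, sigma=1.0):
    """Measure a marginal over `cols` and add Gaussian noise.

    A sketch of one measurement step; a real mechanism would calibrate
    sigma to a formal privacy budget.
    """
    counts = np.zeros((2,) * len(cols))
    for row in data[:, cols]:
        counts[tuple(row)] += 1
    return counts + rng.normal(0, sigma, counts.shape)

# Each measured clique of attributes becomes a clique in the implied
# graphical model. Attributes 0 and 2 never co-occur in a measurement,
# so the implied structure is a chain 0 - 1 - 2, and exact marginal
# inference over it stays tractable.
measurements = {(0, 1): noisy_marginal(data, [0, 1]),
                (1, 2): noisy_marginal(data, [1, 2])}
```

Measuring a marginal over all three attributes jointly would instead induce a single large clique, which is exactly the regime where exact inference becomes expensive.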
1 code implementation • 29 May 2019 • Satya Kuppam, Ryan McKenna, David Pujol, Michael Hay, Ashwin Machanavajjhala, Gerome Miklau
Data collected about individuals is regularly used to make decisions that impact those same individuals.
Databases
no code implementations • 1 Feb 2019 • Tengyang Xie, Philip S. Thomas, Gerome Miklau
Many reinforcement learning applications involve the use of data that is sensitive, such as medical records of patients or financial information.
4 code implementations • 26 Jan 2019 • Ryan McKenna, Daniel Sheldon, Gerome Miklau
Many privacy mechanisms reveal high-level information about a data distribution through noisy measurements.
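A canonical instance of such a noisy measurement is a differentially private histogram released via the Laplace mechanism; this minimal sketch (function name and parameters are illustrative, not from the paper) shows the pattern of true counts plus calibrated noise.

```python
import numpy as np

rng = np.random.default_rng(42)

def laplace_histogram(data, bins, epsilon):
    """Release a differentially private histogram.

    Adding or removing one individual changes each count by at most 1,
    so Laplace noise with scale 1/epsilon yields epsilon-differential
    privacy for the released counts.
    """
    counts, _ = np.histogram(data, bins=bins)
    return counts + rng.laplace(0, 1.0 / epsilon, counts.shape)
```

Downstream estimation methods, such as the graphical-model approach described here, then infer a data distribution consistent with these noisy counts.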
no code implementations • ICML 2017 • Garrett Bernstein, Ryan McKenna, Tao Sun, Daniel Sheldon, Michael Hay, Gerome Miklau
A naive learning algorithm that uses the noisy sufficient statistics “as is” outperforms general-purpose differentially private learning algorithms.
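The "as is" strategy can be sketched as follows: perturb the sufficient statistic once with calibrated noise, then plug the noisy value directly into the standard estimator. This toy Bernoulli example is an illustrative assumption, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Estimate a Bernoulli parameter from a noisy sufficient statistic.
x = rng.integers(0, 2, size=500)
epsilon = 1.0

# The sufficient statistic is sum(x); one individual changes it by at
# most 1, so Laplace(1/epsilon) noise suffices for epsilon-DP.
noisy_sum = x.sum() + rng.laplace(0, 1.0 / epsilon)

# Plug-in "as is" estimate: treat the noisy statistic as if it were exact,
# clipping to the valid parameter range.
theta_hat = np.clip(noisy_sum / len(x), 0.0, 1.0)
```

Because the noise is added once to a low-sensitivity statistic, the estimator's error shrinks with the sample size, which is why this naive approach can beat general-purpose private learners.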
no code implementations • 14 Jun 2017 • Garrett Bernstein, Ryan McKenna, Tao Sun, Daniel Sheldon, Michael Hay, Gerome Miklau
We investigate the problem of learning discrete, undirected graphical models in a differentially private way.
1 code implementation • 15 Dec 2015 • Michael Hay, Ashwin Machanavajjhala, Gerome Miklau, Yan Chen, Dan Zhang
Differential privacy has become the dominant standard in the research community for strong privacy protection.
Databases • Cryptography and Security