no code implementations • 9 Feb 2024 • Antti Koskela, Rachel Redberg, Yu-Xiang Wang
Private selection mechanisms (e.g., Report Noisy Max, Sparse Vector) are fundamental primitives of differentially private (DP) data analysis with wide applications to private query release, voting, and hyperparameter tuning.
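For readers unfamiliar with the primitive, a minimal sketch of Report Noisy Max follows. The function name and signature are illustrative, not from the paper; this is the textbook variant that adds Laplace noise of scale 2Δ/ε to each candidate's score and releases only the argmax, which satisfies ε-DP.

```python
import numpy as np

def report_noisy_max(scores, sensitivity, epsilon, rng=None):
    """Illustrative Report Noisy Max: return the index of the highest
    noisy score. Each score gets independent Laplace(2*sensitivity/epsilon)
    noise; releasing only the argmax is epsilon-DP."""
    rng = rng or np.random.default_rng()
    noisy = np.asarray(scores, dtype=float) + rng.laplace(
        scale=2.0 * sensitivity / epsilon, size=len(scores)
    )
    return int(np.argmax(noisy))
```

Note that only the winning index is released, never the noisy scores themselves; publishing the scores would require a separate privacy accounting.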
no code implementations • NeurIPS 2023 • Rachel Redberg, Antti Koskela, Yu-Xiang Wang
In the arena of privacy-preserving machine learning, differentially private stochastic gradient descent (DP-SGD) has outstripped the objective perturbation mechanism in popularity and interest.
no code implementations • 23 Oct 2023 • Yingyu Lin, Yian Ma, Yu-Xiang Wang, Rachel Redberg
Posterior sampling, i.e., using the exponential mechanism to sample from the posterior distribution, provides $\varepsilon$-pure differential privacy (DP) guarantees and does not suffer from the potentially unbounded privacy breach introduced by $(\varepsilon,\delta)$-approximate DP.
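As a concrete reference point, here is the exponential mechanism over a finite candidate set (the continuous posterior-sampling setting in the paper is more involved; this finite sketch and its names are assumptions, not the authors' code). Sampling a candidate with probability proportional to exp(ε·u/(2Δ)) gives ε-pure DP when u has sensitivity Δ.

```python
import numpy as np

def exponential_mechanism(candidates, utilities, sensitivity, epsilon, rng=None):
    """Illustrative exponential mechanism over a finite candidate set.
    Samples candidate i with probability proportional to
    exp(epsilon * utilities[i] / (2 * sensitivity)); epsilon-pure DP."""
    rng = rng or np.random.default_rng()
    logits = (epsilon / (2.0 * sensitivity)) * np.asarray(utilities, dtype=float)
    logits -= logits.max()          # subtract max for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]
```

Posterior sampling is the special case where the utility is the log-posterior, so the mechanism draws directly from a tempered posterior distribution.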
no code implementations • 31 Dec 2022 • Rachel Redberg, Yuqing Zhu, Yu-Xiang Wang
The "Propose-Test-Release" (PTR) framework is a classic recipe for designing differentially private (DP) algorithms that are data-adaptive, i.e., those that add less noise when the input dataset is nice.
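The classic PTR recipe (Dwork and Lei) can be sketched as follows; the function names and the `dist_to_instability` helper are illustrative assumptions, not the paper's code. One proposes a sensitivity bound, privately tests how far the dataset is from violating it, and releases a noisy answer only if the test passes; each Laplace draw costs ε, for an overall (2ε, δ)-DP guarantee.

```python
import numpy as np

def propose_test_release(x, f, proposed_sensitivity, dist_to_instability,
                         epsilon, delta, rng=None):
    """Illustrative PTR sketch. `dist_to_instability(x)` (a hypothetical,
    sensitivity-1 helper) returns how many records of x must change before
    the local sensitivity of f exceeds `proposed_sensitivity`. Returns None
    ("bottom") when the private test fails; otherwise a noisy answer."""
    rng = rng or np.random.default_rng()
    noisy_dist = dist_to_instability(x) + rng.laplace(scale=1.0 / epsilon)
    if noisy_dist <= np.log(1.0 / delta) / epsilon:
        return None  # too close to an unstable dataset: refuse to answer
    return f(x) + rng.laplace(scale=proposed_sensitivity / epsilon)
```

The data-adaptivity is visible in the last line: when the proposed bound is much smaller than the worst-case global sensitivity, "nice" datasets get much less noise, at the cost of a small δ probability of an incorrect release.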
no code implementations • NeurIPS 2021 • Rachel Redberg, Yu-Xiang Wang
We consider how to privately share the personalized privacy losses incurred by objective perturbation, using per-instance differential privacy (pDP).
1 code implementation • 23 Feb 2020 • Wei Ye, Zhen Wang, Rachel Redberg, Ambuj Singh
At the heart of Tree++ is a graph kernel called the path-pattern graph kernel.