no code implementations • 3 Feb 2023 • Hilal Asi, Jonathan Ullman, Lydia Zakynthinou
Thus, we conclude that for any low-dimensional task, the optimal error rate for $\varepsilon$-differentially private estimators is essentially the same as the optimal error rate for estimators that are robust to adversarially corrupting $1/\varepsilon$ training samples.
no code implementations • 7 Sep 2022 • Konstantina Bairaktari, Guy Blanc, Li-Yang Tan, Jonathan Ullman, Lydia Zakynthinou
We investigate the computational efficiency of multitask learning of Boolean functions over the $d$-dimensional hypercube that are related by means of a feature representation of size $k \ll d$ shared across all tasks.
no code implementations • NeurIPS 2021 • Gavin Brown, Marco Gaboardi, Adam Smith, Jonathan Ullman, Lydia Zakynthinou
Each of our estimators is based on a simple, general approach to designing differentially private mechanisms, but with novel technical steps to make the estimator private and sample-efficient.
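To make the general design pattern concrete (this is a standard textbook recipe, not the paper's actual estimators), an $\varepsilon$-DP mean estimator can clip each sample to a known bound and add Laplace noise calibrated to the resulting sensitivity; the bound `B` and the data below are purely illustrative:

```python
import numpy as np

def dp_mean(samples, epsilon, B):
    """epsilon-DP estimate of a one-dimensional mean, assuming a known bound B.

    Clipping each sample to [-B, B] caps the sensitivity of the empirical
    mean at 2B/n, so Laplace noise of scale 2B/(n*epsilon) suffices for
    epsilon-differential privacy.
    """
    n = len(samples)
    clipped = np.clip(samples, -B, B)
    sensitivity = 2 * B / n
    return clipped.mean() + np.random.laplace(scale=sensitivity / epsilon)

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=1.0, size=10_000)
print(dp_mean(data, epsilon=1.0, B=5.0))  # close to 1.0 for this sample size
```

The "novel technical steps" in the paper concern doing this without a known bound and with far better sample efficiency; the sketch only shows the basic clip-then-noise template.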
no code implementations • 17 Jun 2021 • Peter Grünwald, Thomas Steinke, Lydia Zakynthinou
We give a novel, unified derivation of conditional PAC-Bayesian and mutual information (MI) generalization bounds.
no code implementations • 29 May 2020 • Anamay Chaturvedi, Huy Nguyen, Lydia Zakynthinou
We extend this work by designing differentially private algorithms for both monotone and non-monotone decomposable submodular maximization under general matroid constraints, with competitive utility guarantees.
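A common template in this line of work (sketched here under a uniform matroid, i.e. a simple cardinality constraint, and not to be read as the paper's exact algorithm) is greedy maximization where each greedy choice is made by the exponential mechanism rather than an exact argmax; `sensitivity` bounds how much one data point can shift any marginal gain, and the coverage function below is a toy example:

```python
import math
import random

def dp_greedy_max(ground_set, marginal_gain, k, epsilon, sensitivity):
    """Greedy submodular maximization under a cardinality constraint,
    with each selection made via the exponential mechanism.

    marginal_gain(S, e) returns f(S + {e}) - f(S). Each of the k picks
    is (epsilon/k)-DP, so basic composition makes the whole run epsilon-DP.
    """
    eps_step = epsilon / k
    S, candidates = [], list(ground_set)
    for _ in range(k):
        gains = [marginal_gain(S, e) for e in candidates]
        m = max(gains)  # subtract the max for numerical stability
        weights = [math.exp(eps_step * (g - m) / (2 * sensitivity)) for g in gains]
        r, acc = random.random() * sum(weights), 0.0
        for e, w in zip(candidates, weights):
            acc += w
            if acc >= r:
                chosen = e
                break
        S.append(chosen)
        candidates.remove(chosen)
    return S

# Toy coverage function: each element covers a set of users (the private data).
coverage = {"a": {1, 2, 3}, "b": {3, 4}, "c": {5}, "d": {1, 5, 6}}

def gain(S, e):
    covered = set().union(*(coverage[x] for x in S)) if S else set()
    return len(coverage[e] - covered)

random.seed(0)
print(dp_greedy_max(coverage.keys(), gain, k=2, epsilon=1.0, sensitivity=1.0))
```

Handling general matroid constraints and non-monotone objectives, as the paper does, requires substantially more care than this cardinality-constrained sketch.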
no code implementations • 24 Jan 2020 • Thomas Steinke, Lydia Zakynthinou
We provide an information-theoretic framework for studying the generalization properties of machine learning algorithms.
no code implementations • NeurIPS 2020 • Clément L. Canonne, Gautam Kamath, Audra McMillan, Jonathan Ullman, Lydia Zakynthinou
In this work we present novel differentially private identity (goodness-of-fit) testers for natural and widely studied classes of multivariate product distributions: Gaussians in $\mathbb{R}^d$ with known covariance and product distributions over $\{\pm 1\}^{d}$.
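As a greatly simplified illustration of the known-covariance Gaussian setting (not the testers from the paper), one can clip each sample, privatize the empirical mean with Laplace noise, and threshold its distance from the hypothesized mean; the clipping radius `R` and threshold `tau` are hypothetical tuning parameters:

```python
import numpy as np

def dp_identity_test(X, mu0, epsilon, R, tau):
    """Toy epsilon-DP identity tester for N(mu, I_d) with known covariance.

    Clip each sample to L2 norm R, so the empirical mean has L1
    sensitivity at most 2*R*sqrt(d)/n; add per-coordinate Laplace noise
    at that scale, then accept H0: mu = mu0 iff the noisy mean lands
    within distance tau of mu0.
    """
    n, d = X.shape
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    clipped = X * np.minimum(1.0, R / norms)
    sensitivity = 2 * R * np.sqrt(d) / n
    noisy_mean = clipped.mean(axis=0) + np.random.laplace(
        scale=sensitivity / epsilon, size=d)
    return bool(np.linalg.norm(noisy_mean - mu0) <= tau)  # True = accept H0

rng = np.random.default_rng(0)
X = rng.normal(size=(5_000, 3))  # samples from N(0, I_3)
print(dp_identity_test(X, mu0=np.zeros(3), epsilon=1.0, R=6.0, tau=0.25))
```

The paper's testers achieve much sharper sample complexity than this plug-in sketch; the point is only the shape of a private test statistic.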
no code implementations • 24 Feb 2019 • Huy L. Nguyen, Jonathan Ullman, Lydia Zakynthinou
We present new differentially private algorithms for learning a large-margin halfspace.
no code implementations • NeurIPS 2018 • Huy L. Nguyen, Lydia Zakynthinou
We study a recent model of collaborative PAC learning where $k$ players with $k$ different tasks collaborate to learn a single classifier that works for all tasks.