no code implementations • 21 Dec 2023 • Maryam Aliakbarpour, Konstantina Bairaktari, Gavin Brown, Adam Smith, Nathan Srebro, Jonathan Ullman
In multitask learning, we are given a fixed set of related learning tasks and need to output one accurate model per task, whereas in metalearning we are given tasks that are drawn i.i.d.
no code implementations • 22 May 2023 • Maryam Aliakbarpour, Rose Silver, Thomas Steinke, Jonathan Ullman
We construct differentially private estimators with low sample complexity that estimate the median of an arbitrary distribution over $\mathbb{R}$ satisfying very mild moment conditions.
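As a point of reference for the problem (not the paper's estimator), here is a minimal sketch of private median estimation via the exponential mechanism over a candidate grid; the grid, the utility function, and all parameter values are assumptions made for illustration.

```python
import numpy as np

def dp_median_exponential(data, grid, epsilon, rng):
    """eps-DP median via the exponential mechanism (illustrative sketch).

    Each candidate point in `grid` is scored by minus its rank distance
    from the true median; this utility has sensitivity 1, so sampling a
    candidate with probability proportional to exp(eps * score / 2)
    satisfies eps-differential privacy.
    """
    data = np.sort(np.asarray(data, dtype=float))
    n = len(data)
    below = np.searchsorted(data, grid, side="right")  # number of points <= grid value
    scores = -np.abs(below - n / 2.0)
    logits = epsilon * scores / 2.0
    logits -= logits.max()                             # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return rng.choice(grid, p=probs)

# Usage: heavy-tailed data, bounded candidate grid (both assumptions).
rng = np.random.default_rng(0)
sample = rng.standard_t(df=3, size=500)
grid = np.linspace(-10.0, 10.0, 2001)
print(dp_median_exponential(sample, grid, epsilon=1.0, rng=rng))
```

The exponential mechanism is only a standard baseline; the paper's contribution concerns achieving low sample complexity under mild moment conditions, which this sketch does not attempt.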
no code implementations • 19 May 2022 • Maryam Aliakbarpour, Andrew McGregor, Jelani Nelson, Erik Waingarten
Recent work of Acharya et al. (NeurIPS 2019) showed how to estimate the entropy of a distribution $\mathcal D$ over an alphabet of size $k$ up to $\pm\epsilon$ additive error by streaming over $(k/\epsilon^3) \cdot \text{polylog}(1/\epsilon)$ i.i.d. samples.
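For concreteness, a naive plug-in entropy estimator on i.i.d. samples is sketched below; it stores all counts in memory and is not the streaming, low-space algorithm referenced above, and the particular polylog factor used for the sample size is an assumption.

```python
import numpy as np

def plugin_entropy(samples, k):
    """Empirical (plug-in) Shannon entropy in nats over alphabet {0, ..., k-1}.

    Stores all k counts in memory, so it is NOT the low-space streaming
    algorithm discussed above; it only makes the estimation target concrete.
    """
    counts = np.bincount(samples, minlength=k).astype(float)
    p_hat = counts[counts > 0] / counts.sum()
    return float(-(p_hat * np.log(p_hat)).sum())

# Example: uniform over k symbols, whose true entropy is log(k).
k, eps = 100, 0.25
n = int((k / eps**3) * np.log(1.0 / eps) ** 2)   # assumed polylog factor
rng = np.random.default_rng(1)
samples = rng.integers(0, k, size=n)
print(plugin_entropy(samples, k), np.log(k))
```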
no code implementations • 2 Feb 2021 • Shahab Asoodeh, Maryam Aliakbarpour, Flavio P. Calmon
We investigate the local differential privacy (LDP) guarantees of a randomized privacy mechanism via its contraction properties.
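To ground what an LDP randomized mechanism is, here is a sketch of binary randomized response, a canonical $\epsilon$-LDP mechanism; it is purely illustrative and is not the contraction-based analysis developed in the paper.

```python
import numpy as np

def randomized_response(bits, epsilon, rng):
    """Binary randomized response: an eps-LDP mechanism.

    Each user reports their true bit with probability e^eps / (e^eps + 1)
    and the flipped bit otherwise, so the likelihood ratio between any
    two inputs is at most e^eps (the eps-LDP condition).
    """
    p_truth = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    keep = rng.random(len(bits)) < p_truth
    return np.where(keep, bits, 1 - bits)

# Debiased mean estimate from the noisy reports.
rng = np.random.default_rng(2)
true_bits = rng.integers(0, 2, size=10_000)
eps = 1.0
reports = randomized_response(true_bits, eps, rng)
p = np.exp(eps) / (np.exp(eps) + 1.0)
estimate = (reports.mean() - (1.0 - p)) / (2.0 * p - 1.0)
print(true_bits.mean(), estimate)
```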
no code implementations • 6 Oct 2020 • Maryam Aliakbarpour, Amartya Shankha Biswas, Kavya Ravichandran, Ronitt Rubinfeld
Understanding the shape of a data distribution is of interest across a wide variety of fields, since it can affect which algorithms are appropriate for that data.
no code implementations • NeurIPS 2020 • Khashayar Gatmiry, Maryam Aliakbarpour, Stefanie Jegelka
Determinantal point processes (DPPs) are popular probabilistic models of diversity.
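A small worked example of the L-ensemble formulation, $P(S) \propto \det(L_S)$, which underlies DPPs' diversity-promoting behavior; the toy kernel below is an assumption chosen only for illustration and is not tied to this paper.

```python
import numpy as np
from itertools import combinations

def dpp_subset_probability(L, subset):
    """Probability of a subset under an L-ensemble DPP.

    P(S) = det(L_S) / det(L + I), where L_S is the principal submatrix
    of L indexed by S.  Diverse (nearly orthogonal) items yield larger
    determinants, which is what makes DPPs models of diversity.
    """
    n = L.shape[0]
    idx = list(subset)
    num = np.linalg.det(L[np.ix_(idx, idx)]) if idx else 1.0
    return num / np.linalg.det(L + np.eye(n))

# Toy kernel: items 0 and 1 are similar, item 2 is different.
L = np.array([[1.0, 0.9, 0.1],
              [0.9, 1.0, 0.1],
              [0.1, 0.1, 1.0]])
for S in combinations(range(3), 2):
    print(S, round(dpp_subset_probability(L, S), 4))
# The dissimilar pairs {0,2} and {1,2} get higher probability than {0,1}.
```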
no code implementations • NeurIPS 2019 • Maryam Aliakbarpour, Ilias Diakonikolas, Daniel Kane, Ronitt Rubinfeld
In this paper, we use the framework of property testing to design differentially private algorithms that test properties of the distribution from which the data is drawn.
no code implementations • 17 Nov 2019 • Maryam Aliakbarpour, Sandeep Silwal
We propose a new setting for testing properties of distributions in which the tester receives samples from several distributions, but only a few samples per distribution.
no code implementations • 6 Jul 2019 • Maryam Aliakbarpour, Ravi Kumar, Ronitt Rubinfeld
In our model, the noisy distribution is a mixture of the original distribution and noise, where the latter is known to the tester either explicitly or via sample access; the form of the noise is also known a priori.
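A minimal sketch of this mixture model of noisy samples, with an assumed mixing weight and an assumed noise distribution: each observation comes from the original distribution with probability $1-\alpha$ and from the known noise with probability $\alpha$.

```python
import numpy as np

def sample_noisy_mixture(p_original, p_noise, alpha, n, rng):
    """Draw n samples from (1 - alpha) * p_original + alpha * p_noise.

    The tester observes only these mixed samples; p_noise (and alpha)
    are taken as known, matching the "noise known explicitly" variant.
    """
    k = len(p_original)
    from_noise = rng.random(n) < alpha
    return np.where(
        from_noise,
        rng.choice(k, size=n, p=p_noise),
        rng.choice(k, size=n, p=p_original),
    )

# Example: uniform noise mixed into a skewed original distribution.
rng = np.random.default_rng(4)
k, alpha = 10, 0.3
p_original = np.arange(1, k + 1, dtype=float)
p_original /= p_original.sum()
p_noise = np.full(k, 1.0 / k)
print(np.bincount(sample_noisy_mixture(p_original, p_noise, alpha, 5000, rng), minlength=k))
```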
no code implementations • 6 Jul 2019 • Maryam Aliakbarpour, Themis Gouleakis, John Peebles, Ronitt Rubinfeld, Anak Yodpinyanee
We then build on these lower bounds to give $\Omega(n/\log{n})$ lower bounds for testing monotonicity over a matching poset of size $n$ and significantly improved lower bounds over the hypercube poset.
no code implementations • ICML 2018 • Maryam Aliakbarpour, Ilias Diakonikolas, Ronitt Rubinfeld
Our theoretical results significantly improve over the best known algorithms for identity testing, and are the first results for private equivalence testing.
no code implementations • 18 Jul 2017 • Maryam Aliakbarpour, Ilias Diakonikolas, Ronitt Rubinfeld
We investigate the problems of identity and closeness testing over a discrete population from random samples.
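To illustrate the identity-testing problem itself (not the testers developed in the paper), here is a naive plug-in tester that thresholds the empirical total variation distance to a known reference distribution; its sample complexity is far worse than that of the optimal testers in this literature, and all parameter choices below are assumptions.

```python
import numpy as np

def naive_identity_tester(samples, q, eps):
    """Plug-in identity tester (illustrative only).

    Estimates the unknown distribution p by empirical frequencies and
    accepts "p = q" iff the empirical total variation distance to the
    known reference q is at most eps / 2.  Optimal identity testers use
    different statistics and far fewer samples; this sketch just makes
    the testing problem concrete.
    """
    counts = np.bincount(samples, minlength=len(q)).astype(float)
    p_hat = counts / counts.sum()
    tv = 0.5 * np.abs(p_hat - q).sum()
    return tv <= eps / 2.0

# Example: q is uniform; p is either q itself or eps-far from q in TV.
rng = np.random.default_rng(3)
k, eps = 20, 0.2
q = np.full(k, 1.0 / k)
p_far = q.copy()
p_far[: k // 2] *= 1.0 + 2.0 * eps     # TV(p_far, q) = eps
p_far[k // 2 :] *= 1.0 - 2.0 * eps
for p in (q, p_far):
    samples = rng.choice(k, size=20_000, p=p)
    print(naive_identity_tester(samples, q, eps))
```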