Search Results for author: Hassan Ashtiani

Found 17 papers, 2 papers with code

Sample-Optimal Locally Private Hypothesis Selection and the Provable Benefits of Interactivity

no code implementations • 9 Dec 2023 • Alireza F. Pour, Hassan Ashtiani, Shahab Asoodeh

Our interactive scheme breaks the known lower bound of $\Omega\left(\frac{k\log k}{\alpha^2\min\{\varepsilon^2, 1\}}\right)$ for the sample complexity of non-interactive hypothesis selection.
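
For scale, the quoted lower bound is easy to evaluate numerically. A minimal Python sketch, with illustrative parameter values of my choosing and the hidden constant in the $\Omega(\cdot)$ dropped:

    import math

    def noninteractive_lower_bound(k, alpha, eps):
        """Evaluate k*log(k) / (alpha^2 * min(eps^2, 1)), i.e. the
        Omega(...) rate with its hidden constant dropped (illustrative)."""
        return k * math.log(k) / (alpha ** 2 * min(eps ** 2, 1.0))

    # e.g., k = 100 hypotheses, accuracy alpha = 0.1, privacy eps = 0.5
    print(noninteractive_lower_bound(100, 0.1, 0.5))  # ~184207 samples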

Mixtures of Gaussians are Privately Learnable with a Polynomial Number of Samples

no code implementations • 7 Sep 2023 • Mohammad Afzali, Hassan Ashtiani, Christopher Liaw

We study the problem of estimating mixtures of Gaussians under the constraint of differential privacy (DP).

Polynomial Time and Private Learning of Unbounded Gaussian Mixture Models

no code implementations • 7 Mar 2023 • Jamil Arbas, Hassan Ashtiani, Christopher Liaw

We study the problem of privately estimating the parameters of $d$-dimensional Gaussian Mixture Models (GMMs) with $k$ components.

Benefits of Additive Noise in Composing Classes with Bounded Capacity

1 code implementation • 14 Jun 2022 • Alireza Fathollah Pour, Hassan Ashtiani

We observe that given two (compatible) classes of functions $\mathcal{F}$ and $\mathcal{H}$ with small capacity as measured by their uniform covering numbers, the capacity of the composition class $\mathcal{H} \circ \mathcal{F}$ can become prohibitively large or even unbounded.
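
A minimal NumPy sketch of the idea, assuming toy stand-ins for members of $\mathcal{F}$ and $\mathcal{H}$ (the shapes, nonlinearities, and noise scale below are my illustrative assumptions, not the paper's setup):

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(10, 20))  # toy member of F
    W2 = rng.normal(size=(20, 5))   # toy member of H

    def f(x):
        return np.tanh(x @ W1)

    def h(z):
        return np.tanh(z @ W2)

    def noisy_composition(x, sigma=0.1):
        # Inject Gaussian noise between the two classes; the paper's
        # point is that such noise keeps the uniform covering number
        # of the composition H o F under control.
        z = f(x)
        return h(z + sigma * rng.normal(size=z.shape))

    x = rng.normal(size=(4, 10))
    print(noisy_composition(x).shape)  # (4, 5)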

Adversarially Robust Learning with Tolerance

no code implementations • 2 Mar 2022 • Hassan Ashtiani, Vinayak Pathak, Ruth Urner

In the tolerant version, the error of the learner is compared with the best achievable error with respect to a slightly larger perturbation radius $(1+\gamma)r$.
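
Spelled out, a tolerant guarantee of this form reads as follows (notation reconstructed from the sentence above; $B_r(x)$ denotes the perturbation ball of radius $r$ around $x$):

    $\Pr_{(x,y)}\left[\exists x' \in B_r(x):\ \hat{h}(x') \neq y\right] \le \min_{h \in \mathcal{H}} \Pr_{(x,y)}\left[\exists x' \in B_{(1+\gamma)r}(x):\ h(x') \neq y\right] + \epsilon$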

PAC learning

Private and polynomial time algorithms for learning Gaussians and beyond

no code implementations • 22 Nov 2021 • Hassan Ashtiani, Christopher Liaw

As another application of our framework, we provide the first polynomial time $(\varepsilon, \delta)$-DP algorithm for robust learning of (unrestricted) Gaussians with sample complexity $\widetilde{O}(d^{3.5})$.

Privately Learning Mixtures of Axis-Aligned Gaussians

no code implementations • NeurIPS 2021 • Ishaq Aden-Ali, Hassan Ashtiani, Christopher Liaw

We show that if $\mathcal{F}$ is privately list-decodable, then we can privately learn mixtures of distributions in $\mathcal{F}$.

On the Sample Complexity of Privately Learning Unbounded High-Dimensional Gaussians

no code implementations • 19 Oct 2020 • Ishaq Aden-Ali, Hassan Ashtiani, Gautam Kamath

These are the first finite sample upper bounds for general Gaussians which do not impose restrictions on the parameters of the distribution.

Black-box Certification and Learning under Adversarial Perturbations

no code implementations • ICML 2020 • Hassan Ashtiani, Vinayak Pathak, Ruth Urner

We formally study the problem of classification under adversarial perturbations, both from a learner's perspective and from that of a third party who aims to certify the robustness of a given black-box classifier.

On the Sample Complexity of Learning Sum-Product Networks

no code implementations • 5 Dec 2019 • Ishaq Aden-Ali, Hassan Ashtiani

We show that the sample complexity of learning tree-structured SPNs with the usual type of leaves (i.e., Gaussian or discrete) grows at most linearly (up to logarithmic factors) with the number of parameters of the SPN.
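
A minimal Python sketch of a tree-structured SPN with Gaussian leaves (the structure, weights, and parameters below are illustrative assumptions, not from the paper):

    import math

    def gaussian_leaf(x, mu, sigma):
        # univariate Gaussian density leaf
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    def spn_density(x):
        # Two product nodes, each over the disjoint scopes x[0] and x[1],
        # combined by a sum node with mixture weights (0.7, 0.3).
        comp1 = gaussian_leaf(x[0], 0.0, 1.0) * gaussian_leaf(x[1], 0.0, 1.0)
        comp2 = gaussian_leaf(x[0], 3.0, 0.5) * gaussian_leaf(x[1], -2.0, 2.0)
        return 0.7 * comp1 + 0.3 * comp2

    print(spn_density((0.1, -0.2)))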

PAC learning

Disentangled behavioural representations

1 code implementation • NeurIPS 2019 • Amir Dezfouli, Hassan Ashtiani, Omar Ghattas, Richard Nock, Peter Dayan, Cheng Soon Ong

Individual characteristics in human decision-making are often quantified by fitting a parametric cognitive model to subjects' behavior and then studying differences between them in the associated parameter space.

Decision Making

Nearly tight sample complexity bounds for learning mixtures of Gaussians via sample compression schemes

no code implementations • NeurIPS 2018 • Hassan Ashtiani, Shai Ben-David, Nicholas Harvey, Christopher Liaw, Abbas Mehrabian, Yaniv Plan

We prove that $\Theta(k d^2 / \varepsilon^2)$ samples are necessary and sufficient for learning a mixture of $k$ Gaussians in $\mathbb{R}^d$, up to error $\varepsilon$ in total variation distance.
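
To see the scaling, the rate can be evaluated up to constants and log factors; a one-line Python sketch (the example values are mine):

    def gmm_sample_bound(k, d, eps):
        """Theta(k * d^2 / eps^2) with hidden constants dropped."""
        return k * d ** 2 / eps ** 2

    print(gmm_sample_bound(k=3, d=10, eps=0.1))  # 30000.0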

Some techniques in density estimation

no code implementations • 11 Jan 2018 • Hassan Ashtiani, Abbas Mehrabian

Density estimation is an interdisciplinary topic at the intersection of statistics, theoretical computer science and machine learning.

Density Estimation

Near-optimal Sample Complexity Bounds for Robust Learning of Gaussian Mixtures via Compression Schemes

no code implementations • 14 Oct 2017 • Hassan Ashtiani, Shai Ben-David, Nick Harvey, Christopher Liaw, Abbas Mehrabian, Yaniv Plan

We prove that $\tilde{\Theta}(k d^2 / \varepsilon^2)$ samples are necessary and sufficient for learning a mixture of $k$ Gaussians in $\mathbb{R}^d$, up to error $\varepsilon$ in total variation distance.

Sample-Efficient Learning of Mixtures

no code implementations • 6 Jun 2017 • Hassan Ashtiani, Shai Ben-David, Abbas Mehrabian

Let $\mathcal F$ be an arbitrary class of probability distributions, and let $\mathcal{F}^k$ denote the class of $k$-mixtures of elements of $\mathcal F$.
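
Here $\mathcal{F}^k$ carries its standard meaning, stated explicitly for concreteness:

    $\mathcal{F}^k = \left\{ \sum_{i=1}^{k} w_i f_i \;:\; f_1, \dots, f_k \in \mathcal{F},\ w_i \ge 0,\ \sum_{i=1}^{k} w_i = 1 \right\}$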

Density Estimation
PAC learning

Clustering with Same-Cluster Queries

no code implementations • NeurIPS 2016 • Hassan Ashtiani, Shrinu Kushagra, Shai Ben-David

We show that there is a trade-off between computational complexity and query complexity: we prove that for the case of $k$-means clustering (i.e., when the expert conforms to a solution of $k$-means), having access to relatively few such queries allows efficient solutions to otherwise NP-hard problems.
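
A minimal Python sketch of the query model (the oracle and the greedy assignment below illustrate the interface only; this is not the paper's algorithm, and it uses more queries than the paper's bounds require):

    def assign_with_queries(points, oracle):
        # oracle(i, j) -> True iff points i and j share a cluster in
        # the expert's target clustering. Greedily keep one
        # representative per discovered cluster and place each point
        # by querying it against the representatives.
        reps, labels = [], []
        for i in range(len(points)):
            for c, r in enumerate(reps):
                if oracle(i, r):
                    labels.append(c)
                    break
            else:
                reps.append(i)
                labels.append(len(reps) - 1)
        return labels

    # toy target clustering: parity of the point index
    oracle = lambda i, j: (i % 2) == (j % 2)
    print(assign_with_queries(range(6), oracle))  # [0, 1, 0, 1, 0, 1]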

Clustering

Representation Learning for Clustering: A Statistical Framework

no code implementations • 19 Jun 2015 • Hassan Ashtiani, Shai Ben-David

The algorithm designer then uses that sample to come up with a data representation under which $k$-means clustering results in a clustering (of the full data set) that is aligned with the user's clustering.

Clustering
Representation Learning
