Search Results for author: Alistair Stewart

Found 21 papers, 5 papers with code

Outlier-Robust Learning of Ising Models Under Dobrushin's Condition

no code implementations 3 Feb 2021 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart, Yuxin Sun

We study the problem of learning Ising models satisfying Dobrushin's condition in the outlier-robust setting where a constant fraction of the samples are adversarially corrupted.

Testing for Families of Distributions via the Fourier Transform

no code implementations NeurIPS 2018 Alistair Stewart, Ilias Diakonikolas, Clement Canonne

We study the general problem of testing whether an unknown discrete distribution belongs to a specified family of distributions.

Two-sample testing
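
To illustrate the Fourier approach at a high level (a toy sketch, not the paper's tester): for many structured families, the discrete Fourier transform of any member is concentrated on a small set of low frequencies, so one can reject when the empirical DFT carries too much high-frequency energy. The band `K` and `threshold` below are placeholder parameters, not the paper's calibrated choices.

```python
import numpy as np

def fourier_band_test(samples, domain_size, K, threshold):
    """Toy membership test: reject when the empirical DFT has too much
    energy outside the low-frequency band {-K, ..., K}. K and threshold
    are illustrative placeholders."""
    # Empirical distribution over {0, ..., domain_size - 1}.
    p_hat = np.bincount(samples, minlength=domain_size) / len(samples)
    dft = np.fft.fft(p_hat)
    # Energy on frequencies outside {-K, ..., K}.
    tail_energy = np.sum(np.abs(dft[K + 1: domain_size - K]) ** 2)
    return tail_energy <= threshold  # True = "plausibly in the family"
```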

Efficient Algorithms and Lower Bounds for Robust Linear Regression

no code implementations 31 May 2018 Ilias Diakonikolas, Weihao Kong, Alistair Stewart

An error of $\Omega(\epsilon \sigma)$ is information-theoretically necessary, even with infinite sample size.

Regression
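
A standard two-point argument gives intuition for this lower bound (a sketch from general principles, not the paper's exact construction):

```latex
% Sketch: for y = <w, x> + z with Gaussian noise z ~ N(0, sigma^2),
% two parameter vectors w, w' with |<w - w', x>| = eps * sigma induce
% response distributions at total variation distance
\[
  d_{\mathrm{TV}}\big(\mathcal{N}(\langle w, x\rangle, \sigma^2),\;
                      \mathcal{N}(\langle w', x\rangle, \sigma^2)\big)
  = O(\epsilon),
\]
% so an adversary corrupting an eps-fraction of the samples can make
% the two models indistinguishable, forcing error Omega(eps * sigma)
% for at least one of them.
```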

Sever: A Robust Meta-Algorithm for Stochastic Optimization

1 code implementation 7 Mar 2018 Ilias Diakonikolas, Gautam Kamath, Daniel M. Kane, Jerry Li, Jacob Steinhardt, Alistair Stewart

In high dimensions, most machine learning methods are brittle to even a small fraction of structured outliers.

Stochastic Optimization
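
The Sever recipe fits in a short sketch (simplified from the paper: a fixed top fraction of points is removed each round, and the stopping rule on the top singular value is omitted; `fit` and `grad` are placeholders for any base learner and its per-point loss gradient):

```python
import numpy as np

def sever(X, y, fit, grad, frac_remove=0.01, n_iter=10):
    """Simplified Sever filtering loop: refit, project per-point loss
    gradients onto their top singular direction, and drop the points
    with the largest outlier scores."""
    idx = np.arange(len(y))
    for _ in range(n_iter):
        theta = fit(X[idx], y[idx])                    # run the base learner
        G = np.stack([grad(theta, X[i], y[i]) for i in idx])
        G = G - G.mean(axis=0)                         # center the gradients
        # Top right-singular vector of the centered gradient matrix.
        _, _, Vt = np.linalg.svd(G, full_matrices=False)
        scores = (G @ Vt[0]) ** 2                      # outlier scores
        keep = np.argsort(scores)[: int(len(idx) * (1 - frac_remove))]
        idx = idx[keep]
    return fit(X[idx], y[idx])
```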

List-Decodable Robust Mean Estimation and Learning Mixtures of Spherical Gaussians

no code implementations 20 Nov 2017 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart

We give a learning algorithm for mixtures of spherical Gaussians that succeeds under significantly weaker separation assumptions compared to prior work.

Sharp Bounds for Generalized Uniformity Testing

no code implementations NeurIPS 2018 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart

We study the problem of generalized uniformity testing [BC17] of a discrete probability distribution: given samples from a probability distribution $p$ over an unknown discrete domain $\mathbf{\Omega}$, we want to distinguish, with probability at least $2/3$, between the case that $p$ is uniform on some subset of $\mathbf{\Omega}$ and the case that $p$ is $\epsilon$-far, in total variation distance, from every such uniform distribution.
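
The structural identity behind testers for this problem is easy to state: if $p$ is uniform on a subset of size $s$, then $\sum_x p(x)^2 = 1/s$ and $\sum_x p(x)^3 = 1/s^2$, so $\|p\|_3^3 = (\|p\|_2^2)^2$. Both power sums admit unbiased collision-based estimators, as in the toy sketch below (an illustration only; the paper's tester, which attains the sharp sample bounds, calibrates its threshold against $\epsilon$, and `slack` here is a placeholder):

```python
import numpy as np
from math import comb

def collision_moments(samples):
    """Unbiased estimates of sum_x p(x)^2 and sum_x p(x)^3 from
    2-way and 3-way collision counts among the samples."""
    m = len(samples)
    counts = np.unique(samples, return_counts=True)[1]
    pairs = sum(comb(int(c), 2) for c in counts) / comb(m, 2)
    triples = sum(comb(int(c), 3) for c in counts) / comb(m, 3)
    return pairs, triples

def toy_generalized_uniformity_test(samples, slack):
    """Accept if the power sums satisfy the uniform-on-a-subset
    identity ||p||_3^3 = (||p||_2^2)^2 up to `slack` (placeholder)."""
    m2, m3 = collision_moments(samples)
    return abs(m3 - m2 ** 2) <= slack
```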

Learning Geometric Concepts with Nasty Noise

no code implementations 5 Jul 2017 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart

We give the first polynomial-time PAC learning algorithms for these concept classes with dimension-independent error guarantees in the presence of nasty noise under the Gaussian distribution.

Outlier Detection

Robustly Learning a Gaussian: Getting Optimal Error, Efficiently

no code implementations 12 Apr 2017 Ilias Diakonikolas, Gautam Kamath, Daniel M. Kane, Jerry Li, Ankur Moitra, Alistair Stewart

We give robust estimators that achieve estimation error $O(\varepsilon)$ in the total variation distance, which is optimal up to a universal constant that is independent of the dimension.

Being Robust (in High Dimensions) Can Be Practical

2 code implementations ICML 2017 Ilias Diakonikolas, Gautam Kamath, Daniel M. Kane, Jerry Li, Ankur Moitra, Alistair Stewart

Robust estimation is much more challenging in high dimensions than it is in one dimension: Most techniques either lead to intractable optimization problems or estimators that can tolerate only a tiny fraction of errors.

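The common algorithmic core of this line of work (shared with "Robust Estimators in High Dimensions without the Computational Intractability" below) is a filter: while the empirical covariance has a suspiciously large eigenvalue, project the data onto the top eigenvector and discard the most extreme points. A simplified sketch, assuming an identity-covariance Gaussian inlier model and using placeholder thresholds:

```python
import numpy as np

def filter_mean(X, spectral_bound=1.5, frac_remove=0.01, max_iter=50):
    """Simplified filter for robust mean estimation. spectral_bound
    and frac_remove are illustrative placeholders, not the papers'
    data-dependent thresholds."""
    X = np.asarray(X, dtype=float)
    for _ in range(max_iter):
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)
        if eigvals[-1] <= spectral_bound:   # spectrum looks clean: stop
            return mu
        v = eigvecs[:, -1]                  # top eigenvector
        scores = ((X - mu) @ v) ** 2        # extremeness along v
        keep = np.argsort(scores)[: int(len(X) * (1 - frac_remove))]
        X = X[keep]
    return X.mean(axis=0)
```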

Testing Bayesian Networks

no code implementations 9 Dec 2016 Clement Canonne, Ilias Diakonikolas, Daniel Kane, Alistair Stewart

This work initiates a systematic investigation of testing high-dimensional structured distributions by focusing on testing Bayesian networks -- the prototypical family of directed graphical models.

Statistical Query Lower Bounds for Robust Estimation of High-dimensional Gaussians and Gaussian Mixtures

no code implementations 10 Nov 2016 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart

For each of these problems, we show a super-polynomial gap between the (information-theoretic) sample complexity and the computational complexity of any Statistical Query algorithm for the problem.

Robust Learning of Fixed-Structure Bayesian Networks

1 code implementation NeurIPS 2018 Yu Cheng, Ilias Diakonikolas, Daniel Kane, Alistair Stewart

We investigate the problem of learning Bayesian networks in a robust model where an $\epsilon$-fraction of the samples are adversarially corrupted.

Efficient Robust Proper Learning of Log-concave Distributions

no code implementations 9 Jun 2016 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart

We study the robust proper learning of univariate log-concave distributions (over continuous and discrete domains).

Learning Multivariate Log-concave Distributions

no code implementations 26 May 2016 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart

Prior to our work, no upper bound on the sample complexity of this learning problem was known for the case of $d>3$.

Robust Estimators in High Dimensions without the Computational Intractability

2 code implementations 21 Apr 2016 Ilias Diakonikolas, Gautam Kamath, Daniel Kane, Jerry Li, Ankur Moitra, Alistair Stewart

We study high-dimensional distribution learning in an agnostic setting where an adversary is allowed to arbitrarily corrupt an $\varepsilon$-fraction of the samples.

Properly Learning Poisson Binomial Distributions in Almost Polynomial Time

no code implementations 12 Nov 2015 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart

A Poisson binomial distribution (PBD) is the distribution of a sum of independent, not necessarily identically distributed Bernoulli random variables. Given $\widetilde{O}(1/\epsilon^2)$ samples from an unknown PBD $\mathbf{p}$, our algorithm runs in time $(1/\epsilon)^{O(\log \log (1/\epsilon))}$, and outputs a hypothesis PBD that is $\epsilon$-close to $\mathbf{p}$ in total variation distance.

The Fourier Transform of Poisson Multinomial Distributions and its Algorithmic Applications

no code implementations 11 Nov 2015 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart

An $(n, k)$-Poisson Multinomial Distribution (PMD) is a random variable of the form $X = \sum_{i=1}^n X_i$, where the $X_i$'s are independent random vectors supported on the set of standard basis vectors in $\mathbb{R}^k$. In this paper, we obtain a refined structural understanding of PMDs by analyzing their Fourier transform.

Learning Theory
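
The definition translates directly into a sampler, which may help parse the notation (a minimal illustration; the matrix `P` of per-variable probabilities is an arbitrary placeholder):

```python
import numpy as np

def sample_pmd(P, rng=np.random.default_rng()):
    """Draw one sample of an (n, k)-PMD. P is an (n, k) matrix whose
    i-th row gives the distribution of X_i over the k standard basis
    vectors; the sample is X = sum_i X_i, a vector of category counts."""
    n, k = P.shape
    x = np.zeros(k, dtype=int)
    for i in range(n):
        x[rng.choice(k, p=P[i])] += 1   # X_i is the chosen basis vector
    return x
```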

Optimal Learning via the Fourier Transform for Sums of Independent Integer Random Variables

no code implementations 4 May 2015 Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart

As one of our main structural contributions, we give an efficient algorithm to construct a sparse proper $\epsilon$-cover, in total variation distance, for $\mathcal{S}_{n, k}$, the class of sums of $n$ independent integer random variables each supported on $\{0, 1, \dots, k-1\}$.
