Search Results for author: Allen Liu

Found 14 papers, 0 papers with code

Tight Bounds for Quantum State Certification with Incoherent Measurements

no code implementations · 14 Apr 2022 · Sitan Chen, Brice Huang, Jerry Li, Allen Liu

When $\sigma$ is the maximally mixed state $\frac{1}{d} I_d$, this certification problem is known as mixedness testing.
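
As a rough illustration of the objects involved (not the paper's tester), the sketch below builds the maximally mixed state $\frac{1}{d} I_d$ in NumPy and computes the trace distance to another density matrix, which is the quantity a certification algorithm must distinguish from zero; the alternative state is made up for the example.

```python
import numpy as np

def trace_distance(rho, sigma):
    """Trace distance (1/2)||rho - sigma||_1 between two density matrices."""
    eigvals = np.linalg.eigvalsh(rho - sigma)
    return 0.5 * np.sum(np.abs(eigvals))

d = 4
sigma = np.eye(d) / d                    # maximally mixed state (1/d) I_d

# A toy "non-mixed" state: a diagonal state slightly biased towards one basis vector.
p = np.full(d, 0.7 / (d - 1))
p[0] = 0.3
rho = np.diag(p)

# Mixedness testing asks whether rho equals sigma or is epsilon-far in trace distance.
print(trace_distance(rho, sigma))
```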

Semi-Random Sparse Recovery in Nearly-Linear Time

no code implementations · 8 Mar 2022 · Jonathan A. Kelner, Jerry Li, Allen Liu, Aaron Sidford, Kevin Tian

We design a new iterative method tailored to the geometry of sparse recovery which is provably robust to our semi-random model.

The Pareto Frontier of Instance-Dependent Guarantees in Multi-Player Multi-Armed Bandits with no Communication

no code implementations · 19 Feb 2022 · Allen Liu, Mark Sellke

We ask whether it is possible to obtain optimal instance-dependent regret $\tilde{O}(1/\Delta)$ where $\Delta$ is the gap between the $m$-th and $(m+1)$-st best arms.

Multi-Armed Bandits
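
For the entry above, $\Delta$ is simply the gap between the $m$-th and $(m+1)$-st largest arm means; the short sketch below computes it for a hypothetical instance to show how the $\tilde{O}(1/\Delta)$ bound scales. The arm means and the number of players are illustrative only.

```python
import numpy as np

def top_m_gap(means, m):
    """Gap between the m-th and (m+1)-st best arm means, the Delta in the
    instance-dependent regret bound ~O(1/Delta)."""
    order = np.sort(means)[::-1]          # arm means, best first
    return order[m - 1] - order[m]

means = np.array([0.9, 0.8, 0.75, 0.4, 0.1])   # hypothetical arm means
m = 2                                           # number of players
delta = top_m_gap(means, m)
print(delta)        # 0.8 - 0.75 = 0.05 -> regret scales like 1/0.05
```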

Robust Voting Rules from Algorithmic Robust Statistics

no code implementations · 13 Dec 2021 · Allen Liu, Ankur Moitra

In this work we study the problem of robustly learning a Mallows model.

Clustering Mixtures with Almost Optimal Separation in Polynomial Time

no code implementations · 1 Dec 2021 · Jerry Li, Allen Liu

We give the first algorithm which runs in polynomial time, and which almost matches this guarantee.

Margin-Independent Online Multiclass Learning via Convex Geometry

no code implementations · NeurIPS 2021 · Guru Guruganesh, Allen Liu, Jon Schneider, Joshua Wang

We consider the problem of multi-class classification, where a stream of adversarially chosen queries arrive and must be assigned a label online.

Multi-class Classification
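
The snippet above describes an online protocol: queries arrive one at a time, a label must be committed before the true label is revealed, and mistakes are counted. The sketch below shows that interaction loop with a deliberately naive nearest-neighbour learner standing in for the paper's convex-geometry method, which it does not attempt to reproduce.

```python
import numpy as np

class NearestNeighborLearner:
    """Placeholder online learner: predicts the label of the closest previously
    seen labelled point (not the paper's margin-independent algorithm)."""
    def __init__(self):
        self.xs, self.ys = [], []

    def predict(self, x):
        if not self.xs:
            return 0                     # arbitrary default label
        dists = [np.linalg.norm(x - s) for s in self.xs]
        return self.ys[int(np.argmin(dists))]

    def update(self, x, y):
        self.xs.append(x)
        self.ys.append(y)

def run_online(queries, labels, learner):
    """Online multiclass protocol: predict a label for each query, then observe
    the true label and count mistakes."""
    mistakes = 0
    for x, y in zip(queries, labels):
        if learner.predict(x) != y:
            mistakes += 1
        learner.update(x, y)
    return mistakes

queries = [np.array([0.0, 0.0]), np.array([1.0, 1.0]), np.array([0.1, 0.0])]
labels = [0, 1, 0]
print(run_online(queries, labels, NearestNeighborLearner()))   # 1 mistake
```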

Sparsification for Sums of Exponentials and its Algorithmic Applications

no code implementations · 5 Jun 2021 · Jerry Li, Allen Liu, Ankur Moitra

In particular, we give the first algorithms for approximately (and robustly) determining the number of components in a Gaussian mixture model that work without a separation condition.

Learning Theory · Model Selection

Algorithms from Invariants: Smoothed Analysis of Orbit Recovery over $SO(3)$

no code implementations · 4 Jun 2021 · Allen Liu, Ankur Moitra

Our main result is a quasi-polynomial time algorithm for orbit recovery over $SO(3)$ in this model.

Electron Tomography · Tensor Decomposition

Learning GMMs with Nearly Optimal Robustness Guarantees

no code implementations · 19 Apr 2021 · Allen Liu, Ankur Moitra

In this work we solve the problem of robustly learning a high-dimensional Gaussian mixture model with $k$ components from $\epsilon$-corrupted samples up to accuracy $\widetilde{O}(\epsilon)$ in total variation distance for any constant $k$ and with mild assumptions on the mixture.
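
The corruption model referenced above (strong contamination, where an adversary replaces an $\epsilon$ fraction of the samples) can be simulated in a few lines. The sketch below only sets up the input to such an algorithm and is not the paper's estimator; the mixture parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gmm(n, weights, means, cov, rng):
    """Draw n samples from a k-component Gaussian mixture (shared covariance
    only to keep the sketch short)."""
    comps = rng.choice(len(weights), size=n, p=weights)
    return np.array([rng.multivariate_normal(means[c], cov) for c in comps])

def corrupt(samples, eps, rng):
    """Replace an eps fraction of the samples with arbitrary outliers
    (the strong-contamination model)."""
    out = samples.copy()
    idx = rng.choice(len(out), size=int(eps * len(out)), replace=False)
    out[idx] = rng.uniform(-100, 100, size=(len(idx), out.shape[1]))
    return out

weights = [0.5, 0.5]
means = [np.zeros(2), np.array([5.0, 0.0])]
clean = sample_gmm(1000, weights, means, np.eye(2), rng)
corrupted = corrupt(clean, eps=0.05, rng=rng)   # a robust learner sees only this
```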

Myersonian Regression

no code implementations · NeurIPS 2020 · Allen Liu, Renato Leme, Jon Schneider

Motivated by pricing applications in online advertising, we study a variant of linear regression with a discontinuous loss function that we term Myersonian regression.
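
One natural way to formalize the discontinuous pricing loss mentioned above, offered here as an assumption rather than the paper's exact definition, is the revenue shortfall: a posted price earns itself if it is at most the buyer's value and nothing otherwise.

```python
import numpy as np

def pricing_loss(price, value):
    """Revenue shortfall from posting `price` against a buyer of valuation `value`:
    the sale happens only if price <= value, so revenue drops discontinuously to 0
    the moment the price overshoots. (A plausible formalization of the snippet's
    discontinuous loss, not necessarily the paper's exact definition.)"""
    revenue = np.where(price <= value, price, 0.0)
    return value - revenue

values = np.array([1.0, 1.0, 1.0])
prices = np.array([0.9, 1.0, 1.01])
print(pricing_loss(prices, values))   # [0.1, 0.0, 1.0] -- a tiny overshoot loses everything
```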

Settling the Robust Learnability of Mixtures of Gaussians

no code implementations · 6 Nov 2020 · Allen Liu, Ankur Moitra

This work represents a natural coalescence of two important lines of work: learning mixtures of Gaussians and algorithmic robust statistics.

Tensor Completion Made Practical

no code implementations · NeurIPS 2020 · Allen Liu, Ankur Moitra

We show strong provable guarantees, including that our algorithm converges linearly to the true tensors even when the factors are highly correlated, and that it can be implemented in nearly linear time.

Matrix Completion

Optimal Contextual Pricing and Extensions

no code implementations · 3 Mar 2020 · Allen Liu, Renato Paes Leme, Jon Schneider

We provide a generic algorithm with $O(d^2)$ regret where $d$ is the covering dimension of this class.
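
The contextual pricing interaction behind this entry can be sketched as a loop: a context arrives, the seller posts a price, and only whether the sale happened is observed. The hidden linear valuation and the random pricing rule below are placeholders for illustration; the paper's covering-dimension-based algorithm is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 3
theta = rng.uniform(0.0, 1.0, size=d)     # hidden linear valuation, purely illustrative

def post_price(context):
    """Placeholder pricing rule (uniformly random price); an algorithm with
    O(d^2) regret would replace this."""
    return rng.uniform(0.0, d)

regret = 0.0
for t in range(1000):
    x = rng.uniform(0.0, 1.0, size=d)     # context (adversarially chosen in the model)
    value = float(theta @ x)              # buyer's valuation for this context
    price = post_price(x)
    revenue = price if price <= value else 0.0   # only "sold or not" is observed
    regret += value - revenue             # benchmark: pricing exactly at the valuation
```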

Efficiently Learning Mixtures of Mallows Models

no code implementations · 17 Aug 2018 · Allen Liu, Ankur Moitra

Mixtures of Mallows models are a popular generative model for ranking data coming from a heterogeneous population.

Recommendation Systems
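
A Mallows model with central ranking $\pi_0$ and dispersion $\phi \in (0, 1]$ puts probability proportional to $\phi^{d(\pi, \pi_0)}$ on each ranking $\pi$, where $d$ is the Kendall-tau distance; a standard way to sample from it is the repeated insertion method. The sketch below samples from a mixture of two such models with made-up parameters, just to make the generative model concrete.

```python
import numpy as np

def sample_mallows(center, phi, rng):
    """Sample one ranking from a Mallows model with central ranking `center`
    and dispersion phi in (0, 1], via repeated insertion: the i-th item of the
    center is inserted at position j (from the top) with probability
    proportional to phi**(i - j)."""
    ranking = []
    for i, item in enumerate(center, start=1):
        weights = phi ** (i - np.arange(1, i + 1))   # positions 1..i
        j = rng.choice(i, p=weights / weights.sum()) # 0-indexed insertion slot
        ranking.insert(j, item)
    return ranking

def sample_mixture(centers, phis, mix_weights, rng):
    """Mixture of Mallows models: pick a component, then sample from it."""
    c = rng.choice(len(mix_weights), p=mix_weights)
    return sample_mallows(centers[c], phis[c], rng)

rng = np.random.default_rng(0)
print(sample_mixture([[1, 2, 3, 4], [4, 3, 2, 1]], [0.3, 0.5], [0.6, 0.4], rng))
```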
