Search Results for author: David Steurer

Found 19 papers, 0 papers with code

Fast algorithm for overcomplete order-3 tensor decomposition

no code implementations 14 Feb 2022 Jingqiu Ding, Tommaso d'Orsi, Chih-Hung Liu, Stefan Tiegel, David Steurer

We develop the first fast spectral algorithm to decompose a random third-order tensor over $\mathbb{R}^d$ of rank up to $O(d^{3/2}/\text{polylog}(d))$.

Tensor Decomposition Tensor Networks

Beyond Parallel Pancakes: Quasi-Polynomial Time Guarantees for Non-Spherical Gaussian Mixtures

no code implementations 10 Dec 2021 Rares-Darius Buhai, David Steurer

The reason is that such outliers can simulate exponentially small mixing weights even for mixtures whose mixing weights are polynomially lower-bounded.

Robust recovery for stochastic block models

no code implementations 16 Nov 2021 Jingqiu Ding, Tommaso d'Orsi, Rajai Nasser, David Steurer

We develop an efficient algorithm for weak recovery in a robust version of the stochastic block model.

Stochastic Block Model

Consistent Estimation for PCA and Sparse Regression with Oblivious Outliers

no code implementations NeurIPS 2021 Tommaso d'Orsi, Chih-Hung Liu, Rajai Nasser, Gleb Novikov, David Steurer, Stefan Tiegel

For sparse regression, we achieve consistency for optimal sample size $n\gtrsim (k\log d)/\alpha^2$ and optimal error rate $O(\sqrt{(k\log d)/(n\cdot \alpha^2)})$, where $n$ is the number of observations, $d$ is the number of dimensions, and $k$ is the sparsity of the parameter vector, allowing the fraction of inliers to be inverse-polynomial in the number of samples.

Matrix Completion

SoS Degree Reduction with Applications to Clustering and Robust Moment Estimation

no code implementations 5 Jan 2021 David Steurer, Stefan Tiegel

We develop a general framework to significantly reduce the degree of sum-of-squares proofs by introducing new variables.

Sparse PCA: Algorithms, Adversarial Perturbations and Certificates

no code implementations 12 Nov 2020 Tommaso d'Orsi, Pravesh K. Kothari, Gleb Novikov, David Steurer

Despite a long history of prior works, including explicit studies of perturbation resilience, the best known algorithmic guarantees for Sparse PCA are fragile and break down under small adversarial perturbations.

Consistent regression when oblivious outliers overwhelm

no code implementations 30 Sep 2020 Tommaso d'Orsi, Gleb Novikov, David Steurer

Concretely, we show that the Huber loss estimator is consistent for every sample size $n = \omega(d/\alpha^2)$ and achieves an error rate of $O\big(\sqrt{d/(\alpha^2 n)}\big)$.
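As a rough numerical illustration of this setting (not the paper's analysis), the sketch below minimizes the Huber loss by gradient descent on synthetic data where only an $\alpha$-fraction of responses are inliers and the rest carry wild, signal-independent corruptions. All parameter choices (dimensions, corruption scale, step size) are illustrative assumptions.

```python
import numpy as np

def huber_regression(X, y, delta=1.0, lr=0.5, steps=500):
    """Minimize sum_i h_delta(y_i - <x_i, beta>) by gradient descent."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(steps):
        r = y - X @ beta
        # the derivative of the Huber loss is the residual clipped to [-delta, delta]
        beta += lr * (X.T @ np.clip(r, -delta, delta)) / n
    return beta

rng = np.random.default_rng(0)
n, d, alpha = 2000, 10, 0.2                  # alpha = fraction of inliers
X = rng.standard_normal((n, d))
beta_true = rng.standard_normal(d)
inlier = rng.random(n) < alpha               # oblivious corruption pattern
noise = 0.1 * rng.standard_normal(n)
corruption = 100.0 * rng.standard_normal(n)  # outliers overwhelm the signal
y = X @ beta_true + np.where(inlier, noise, corruption)

beta_huber = huber_regression(X, y)
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
err_huber = np.linalg.norm(beta_huber - beta_true)
err_ols = np.linalg.norm(beta_ols - beta_true)
```

Even with most responses corrupted, the Huber estimate lands near `beta_true`, while ordinary least squares is thrown off badly, matching the qualitative message of the snippet above.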

Estimating Rank-One Spikes from Heavy-Tailed Noise via Self-Avoiding Walks

no code implementations NeurIPS 2020 Jingqiu Ding, Samuel B. Hopkins, David Steurer

For the case of Gaussian noise, the top eigenvector of the given matrix is a widely-studied estimator known to achieve optimal statistical guarantees, e.g., in the sense of the celebrated BBP phase transition.
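A quick sketch of that Gaussian-noise baseline: a rank-one spike planted in a Wigner matrix, recovered by the top eigenvector. The normalization and signal strength below are illustrative choices; in this scaling the BBP transition sits at signal strength 1, and above it the squared overlap tends to $1 - 1/\lambda^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 400, 2.0                    # lam > 1: above the BBP threshold
v = rng.standard_normal(n)
v /= np.linalg.norm(v)               # planted unit vector (the "spike")
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2 * n)       # Wigner noise, bulk spectrum in [-2, 2]
M = lam * np.outer(v, v) + W

top = np.linalg.eigh(M)[1][:, -1]    # eigenvector of the largest eigenvalue
corr = abs(top @ v)                  # BBP predicts corr^2 near 1 - 1/lam^2
```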

High-dimensional estimation via sum-of-squares proofs

no code implementations 30 Jul 2018 Prasad Raghavendra, Tselil Schramm, David Steurer

On one hand, there is a growing body of work utilizing sum-of-squares proofs for recovering solutions to polynomial systems when the system is feasible.

Outlier-robust moment-estimation via sum-of-squares

no code implementations 30 Nov 2017 Pravesh K. Kothari, David Steurer

We develop efficient algorithms for estimating low-degree moments of unknown distributions in the presence of adversarial outliers.

Bayesian estimation from few samples: community detection and related problems

no code implementations 30 Sep 2017 Samuel B. Hopkins, David Steurer

in constant average degree graphs---up to what we conjecture to be the computational threshold for this model.

Community Detection Stochastic Block Model +1

Fast and robust tensor decomposition with applications to dictionary learning

no code implementations 27 Jun 2017 Tselil Schramm, David Steurer

We develop fast spectral algorithms for tensor decomposition that match the robustness guarantees of the best known polynomial-time algorithms for this problem based on the sum-of-squares (SOS) semidefinite programming hierarchy.

Dictionary Learning Tensor Decomposition

Exact tensor completion with sum-of-squares

no code implementations 21 Feb 2017 Aaron Potechin, David Steurer

We obtain the first polynomial-time algorithm for exact tensor completion that improves over the bound implied by reduction to matrix completion.

Matrix Completion

Polynomial-time Tensor Decompositions with Sum-of-Squares

no code implementations 6 Oct 2016 Tengyu Ma, Jonathan Shi, David Steurer

We give new algorithms based on the sum-of-squares method for tensor decomposition.

Tensor Decomposition

Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors

no code implementations 8 Dec 2015 Samuel B. Hopkins, Tselil Schramm, Jonathan Shi, David Steurer

For tensor decomposition, we give an algorithm with running time close to linear in the input size (with exponent $\approx 1.086$) that approximately recovers a component of a random 3-tensor over $\mathbb R^n$ of rank up to $\tilde \Omega(n^{4/3})$.

Tensor Decomposition

Tensor principal component analysis via sum-of-squares proofs

no code implementations 12 Jul 2015 Samuel B. Hopkins, Jonathan Shi, David Steurer

We study a statistical model for the tensor principal component analysis problem introduced by Montanari and Richard: Given an order-$3$ tensor $T$ of the form $T = \tau \cdot v_0^{\otimes 3} + A$, where $\tau \geq 0$ is a signal-to-noise ratio, $v_0$ is a unit vector, and $A$ is a random noise tensor, the goal is to recover the planted vector $v_0$.
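A minimal numerical instance of this model, recovered with a simple unfolding-plus-tensor-power-iteration heuristic (a standard baseline, not the sum-of-squares algorithm of the paper); the dimension and signal-to-noise ratio are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, tau = 50, 100.0                            # dimension and signal-to-noise ratio
v0 = rng.standard_normal(n)
v0 /= np.linalg.norm(v0)                      # planted unit vector
A = rng.standard_normal((n, n, n))            # Gaussian noise tensor
T = tau * np.einsum('i,j,k->ijk', v0, v0, v0) + A

# spectral warm start: top left singular vector of the n x n^2 unfolding
v = np.linalg.svd(T.reshape(n, n * n), full_matrices=False)[0][:, 0]

# refine by tensor power iteration: v <- T(., v, v), normalized
for _ in range(20):
    v = np.einsum('ijk,j,k->i', T, v, v)
    v /= np.linalg.norm(v)

corr = abs(v @ v0)                            # overlap near 1 means recovery
```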

Dictionary Learning and Tensor Decomposition via the Sum-of-Squares Method

no code implementations 6 Jul 2014 Boaz Barak, Jonathan A. Kelner, David Steurer

We give a new approach to the dictionary learning (also known as "sparse coding") problem of recovering an unknown $n\times m$ matrix $A$ (for $m \geq n$) from examples of the form \[ y = Ax + e, \] where $x$ is a random vector in $\mathbb R^m$ with at most $\tau m$ nonzero coordinates, and $e$ is a random noise vector in $\mathbb R^n$ with bounded magnitude.

Dictionary Learning Tensor Decomposition
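To make the generative model $y = Ax + e$ from the snippet concrete, here is a small sampler for it; the specific dimensions, sparsity level, noise scale, and the unit-norm column convention are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, tau = 20, 30, 0.1               # overcomplete dictionary: m >= n
A = rng.standard_normal((n, m))
A /= np.linalg.norm(A, axis=0)        # unit-norm atoms (a common convention)

def sample():
    # x: random vector with at most tau*m nonzero coordinates
    x = np.zeros(m)
    support = rng.choice(m, size=int(tau * m), replace=False)
    x[support] = rng.standard_normal(support.size)
    e = 0.01 * rng.uniform(-1.0, 1.0, size=n)   # bounded-magnitude noise
    return A @ x + e, x

# a batch of 100 examples y = Ax + e, one per column
Y = np.column_stack([sample()[0] for _ in range(100)])
```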

Sum-of-squares proofs and the quest toward optimal algorithms

no code implementations 21 Apr 2014 Boaz Barak, David Steurer

Two recent developments, the Unique Games Conjecture (UGC) and the Sum-of-Squares (SOS) method, surprisingly suggest that this tailoring is not necessary and that a single efficient algorithm could achieve best possible guarantees for a wide range of different problems.

Rounding Sum-of-Squares Relaxations

no code implementations 23 Dec 2013 Boaz Barak, Jonathan Kelner, David Steurer

Aside from being a natural relaxation, this is also motivated by a connection to the Small Set Expansion problem shown by Barak et al. (STOC 2012), and our results yield a certain improvement for that problem.
