Search Results for author: Rajai Nasser

Found 4 papers, 0 papers with code

Higher degree sum-of-squares relaxations robust against oblivious outliers

no code implementations • 14 Nov 2022 • Tommaso d'Orsi, Rajai Nasser, Gleb Novikov, David Steurer

Using a reduction from the planted clique problem, we provide evidence that quasipolynomial time is likely necessary for sparse PCA with symmetric noise.

Optimal SQ Lower Bounds for Learning Halfspaces with Massart Noise

no code implementations • 24 Jan 2022 • Rajai Nasser, Stefan Tiegel

Further, this continues to hold even if the information-theoretically optimal error $\mathrm{OPT}$ is as small as $\exp\left(-\log^c(d)\right)$, where $d$ is the dimension and $0 < c < 1$ is an arbitrary absolute constant, and an overwhelming fraction of examples are noiseless.

Robust recovery for stochastic block models

no code implementations • 16 Nov 2021 • Jingqiu Ding, Tommaso d'Orsi, Rajai Nasser, David Steurer

We develop an efficient algorithm for weak recovery in a robust version of the stochastic block model.

Stochastic Block Model
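The abstract above concerns weak recovery in the stochastic block model. For readers unfamiliar with the model, here is a minimal sketch of sampling a two-community SBM graph; the function name and parameters are illustrative, and the paper's robust variant additionally allows adversarial perturbations of the observed graph:

```python
import random

def sample_sbm(n, p_in, p_out, seed=0):
    """Sample a two-community stochastic block model graph (illustrative sketch).

    Vertices 0..n//2-1 form community 0, the rest community 1. Each possible
    edge appears independently with probability p_in if its endpoints share a
    community and p_out otherwise. Weak recovery asks for a labeling that
    correlates with the hidden communities better than random guessing.
    """
    rng = random.Random(seed)
    labels = [0 if i < n // 2 else 1 for i in range(n)]
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            p = p_in if labels[i] == labels[j] else p_out
            if rng.random() < p:
                edges.append((i, j))
    return labels, edges
```

With `p_in` well above `p_out`, intra-community edges dominate, which is the signal a recovery algorithm exploits.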

Consistent Estimation for PCA and Sparse Regression with Oblivious Outliers

no code implementations • NeurIPS 2021 • Tommaso d'Orsi, Chih-Hung Liu, Rajai Nasser, Gleb Novikov, David Steurer, Stefan Tiegel

For sparse regression, we achieve consistency for the optimal sample size $n\gtrsim (k\log d)/\alpha^2$ and optimal error rate $O(\sqrt{(k\log d)/(n\cdot \alpha^2)})$, where $n$ is the number of observations, $d$ is the number of dimensions, and $k$ is the sparsity of the parameter vector; the fraction $\alpha$ of inliers is allowed to be inverse-polynomial in the number of samples.

Matrix Completion • Regression
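The abstract's error rate $O(\sqrt{(k\log d)/(n\cdot \alpha^2)})$ is easy to evaluate numerically. A minimal sketch, with a hypothetical constant `C` standing in for the unspecified factor hidden by the big-O:

```python
import math

def sparse_regression_error_rate(k, d, n, alpha, C=1.0):
    """Evaluate the rate C * sqrt((k log d) / (n * alpha^2)) from the abstract.

    k: sparsity of the parameter vector, d: dimension, n: number of
    observations, alpha: fraction of inliers. C is a hypothetical
    placeholder for the constant hidden by the O(.) notation.
    """
    return C * math.sqrt((k * math.log(d)) / (n * alpha ** 2))

def min_sample_size(k, d, alpha, C=1.0):
    """The sample-size requirement n >~ (k log d) / alpha^2, up to constants."""
    return C * k * math.log(d) / alpha ** 2
```

Note how the rate degrades as the inlier fraction `alpha` shrinks, but consistency survives as long as `n` grows fast enough, matching the abstract's claim that `alpha` may be inverse-polynomial in the number of samples.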
