3 code implementations • 7 Oct 2016 • Emmanuel Candes, Yingying Fan, Lucas Janson, Jinchi Lv

Whereas the knockoffs procedure is constrained to homoscedastic linear models with $n\ge p$, the key innovation here is that model-X knockoffs provide valid inference from finite samples in settings in which the conditional distribution of the response is arbitrary and completely unknown.

Methodology • Statistics Theory • Applications
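A minimal numerical sketch of the model-X knockoff idea, under heavy simplifying assumptions of my own: independent standard-normal features (so the Gaussian knockoff construction reduces to drawing fresh noise columns), marginal-correlation feature statistics rather than the Lasso-based statistics usually recommended, and arbitrary illustrative sizes and target FDR level.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k, q = 500, 50, 10, 0.2

# Independent N(0,1) features: the feature distribution is known by assumption.
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 3.5                      # a few strong effects (illustrative values)
y = X @ beta + rng.standard_normal(n)

# With Sigma = I, the Gaussian knockoff construction
#   X_knock = X (I - diag(s) Sigma^{-1}) + noise,  with s_j = 1,
# reduces to drawing fresh N(0, I) columns independent of X and y.
X_knock = rng.standard_normal((n, p))

# Feature statistics W_j = |X_j' y| - |X_knock_j' y|; a large positive W_j
# is evidence that variable j matters.
W = np.abs(X.T @ y) - np.abs(X_knock.T @ y)

# Knockoff+ threshold: the smallest t whose estimated FDP is at most q.
T = np.inf
for t in np.sort(np.abs(W[W != 0])):
    fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
    if fdp_hat <= q:
        T = t
        break
selected = np.flatnonzero(W >= T)
print(selected)
```

The threshold rule guarantees by construction that the estimated false discovery proportion of the selected set is at most `q`, whatever the response distribution.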

no code implementations • 19 Sep 2016 • Yuxin Chen, Emmanuel Candes

We prove that for a broad class of statistical models, the proposed projected power method makes no error, and hence converges to the maximum likelihood estimate, in a suitable regime.
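A toy instance of the projected power iteration on the simplest alignment problem, Z2 synchronization (my choice of special case; problem sizes and noise level are illustrative): recover a sign vector from a noisy rank-one matrix by alternating a power step with an entrywise projection onto the constraint set.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 200, 1.0

# Z2 synchronization: recover x in {-1,+1}^n from Y = x x^T + sigma * W.
x_true = rng.choice([-1.0, 1.0], size=n)
W = rng.standard_normal((n, n))
Y = np.outer(x_true, x_true) + sigma * (W + W.T) / np.sqrt(2)

# Spectral initialization: sign pattern of the leading eigenvector of Y.
_, vecs = np.linalg.eigh(Y)
x = np.sign(vecs[:, -1])
x[x == 0] = 1.0

# Projected power method: multiply by Y, then project each entry back
# onto the constraint set {-1, +1}.
for _ in range(100):
    x_new = np.sign(Y @ x)
    x_new[x_new == 0] = 1.0
    if np.array_equal(x_new, x):    # reached a fixed point
        break
    x = x_new

# The labels are identifiable only up to a global sign flip.
err = min(np.sum(x != x_true), np.sum(x != -x_true))
print(err)
```

At this noise level the iteration reaches a fixed point with zero misclassified entries, illustrating the "makes no error" regime in the abstract.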

3 code implementations • 5 Nov 2015 • Weijie Su, Malgorzata Bogdan, Emmanuel Candes

In regression settings where explanatory variables have very low correlations and there are relatively few effects, each of large magnitude, we expect the Lasso to find the important variables with few errors, if any.
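A small simulation of exactly this favorable setting, with a plain proximal-gradient (ISTA) Lasso solver so the sketch stays dependency-free; the design, signal strengths, and penalty level are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, k = 300, 60, 4

# Nearly orthogonal design: independent Gaussian columns, normalized.
X = rng.standard_normal((n, p))
X /= np.linalg.norm(X, axis=0)
beta = np.zeros(p)
beta[:k] = 3.0                      # few effects, each of large magnitude
y = X @ beta + 0.1 * rng.standard_normal(n)

def lasso_ista(X, y, lam, n_iter=2000):
    """Minimize 0.5*||y - Xb||^2 + lam*||b||_1 by proximal gradient (ISTA)."""
    L = np.linalg.norm(X, 2) ** 2   # Lipschitz constant of the smooth part
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        z = b - X.T @ (X @ b - y) / L       # gradient step
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return b

b_hat = lasso_ista(X, y, lam=0.6)
support = np.flatnonzero(np.abs(b_hat) > 1e-6)
print(support)
```

With low column correlations and a few strong effects, the estimated support coincides with (or is dominated by) the true variables, matching the expectation stated in the abstract.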

no code implementations • 29 Mar 2015 • Weijie Su, Emmanuel Candes

We consider high-dimensional sparse regression problems in which we observe $y = X \beta + z$, where $X$ is an $n \times p$ design matrix and $z$ is an $n$-dimensional vector of independent Gaussian errors, each with variance $\sigma^2$.

Statistics Theory • Information Theory
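This abstract matches the authors' SLOPE line of work (sorted-ℓ1 penalized estimation) — an assumption on my part from the author list and date. As an illustration, here is the computational core of that estimator, the prox of the sorted-ℓ1 penalty, via a stack-based pool-adjacent-violators pass; the example inputs are arbitrary.

```python
import numpy as np

def prox_sorted_l1(v, lam):
    """Prox of the sorted-L1 penalty sum_i lam[i] * |v|_(i), lam non-increasing.

    Subtract lam from the sorted magnitudes, enforce a non-increasing
    sequence by averaging violating adjacent blocks (PAVA), clip at zero,
    then restore the original order and signs.
    """
    sign = np.sign(v)
    order = np.argsort(np.abs(v))[::-1]       # indices by decreasing |v|
    u = np.abs(v)[order] - lam
    vals, sizes = [], []
    for x in u:
        vals.append(x)
        sizes.append(1)
        while len(vals) > 1 and vals[-2] <= vals[-1]:   # pool violators
            s, x2 = sizes.pop(), vals.pop()
            vals[-1] = (vals[-1] * sizes[-1] + x2 * s) / (sizes[-1] + s)
            sizes[-1] += s
    w = np.concatenate([np.full(s, val) for val, s in zip(vals, sizes)])
    out = np.empty_like(v, dtype=float)
    out[order] = np.maximum(w, 0.0)
    return sign * out

# With all lam equal, the prox reduces to ordinary soft thresholding.
print(prox_sorted_l1(np.array([3.0, -1.0, 0.5]), np.array([1.0, 1.0, 1.0])))
```

Using a decreasing sequence of penalty levels (rather than the Lasso's single level) is what lets the procedure adapt to unknown sparsity.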

no code implementations • NeurIPS 2014 • Weijie Su, Stephen Boyd, Emmanuel Candes

We derive a second-order ordinary differential equation (ODE), which is the limit of Nesterov’s accelerated gradient method.
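A numerical sanity check of this correspondence on a toy quadratic of my own choosing: run Nesterov's method with step size $s$, and integrate the ODE $\ddot X + \frac{3}{t}\dot X + \nabla f(X) = 0$ with semi-implicit Euler under the time identification $t = k\sqrt{s}$.

```python
import numpy as np

# Toy objective f(x) = 0.5 x^2 with gradient f'(x) = x.
grad = lambda x: x

s = 1e-3          # gradient step size
steps = 3000

# Nesterov's accelerated gradient method.
x_prev = x = 1.0
nag = [x]
for k in range(1, steps + 1):
    y = x + (k - 1) / (k + 2) * (x - x_prev)   # momentum step
    x_prev, x = x, y - s * grad(y)             # gradient step
    nag.append(x)

# The limiting ODE  X'' + (3/t) X' + grad f(X) = 0,  X(0)=1, X'(0)=0,
# integrated with semi-implicit Euler at time step sqrt(s).
dt = np.sqrt(s)
X, V = 1.0, 0.0
ode = [X]
for k in range(1, steps + 1):
    t = k * dt
    V += dt * (-(3.0 / t) * V - grad(X))
    X += dt * V
    ode.append(X)

# Both trajectories oscillate and decay together (roughly like t^(-3/2)).
print(abs(nag[-1]), abs(ode[-1]))
```

The vanishing damping coefficient $3/t$ is the distinctive feature of this ODE; both the discrete iterates and the continuous trajectory exhibit the same oscillatory decay.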

1 code implementation • 14 Jan 2010 • Zihan Zhou, XiaoDong Li, John Wright, Emmanuel Candes, Yi Ma

We further prove that the solution to a related convex program (a relaxed PCP) gives an estimate of the low-rank matrix that is simultaneously stable to small entrywise noise and robust to gross sparse errors.
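A minimal sketch of the underlying convex program, principal component pursuit ($\min \|L\|_* + \lambda\|S\|_1$ subject to $L + S = M$), solved with a basic ADMM loop on a synthetic noiseless instance; the paper's relaxed PCP additionally tolerates entrywise noise. Problem sizes and the step-size heuristic are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
n, r = 60, 2

# Low-rank plus sparse test matrix M = L0 + S0.
L0 = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
S0 = np.zeros((n, n))
mask = rng.random((n, n)) < 0.05                 # 5% gross corruptions
S0[mask] = 10.0 * rng.standard_normal(np.sum(mask))
M = L0 + S0

def svt(A, tau):
    """Singular value thresholding: prox of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)

shrink = lambda A, tau: np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

lam = 1.0 / np.sqrt(n)                           # standard PCP weight
mu = 0.25 * n * n / np.sum(np.abs(M))            # common step-size heuristic
L, S, Y = np.zeros_like(M), np.zeros_like(M), np.zeros_like(M)
for _ in range(300):
    L = svt(M - S + Y / mu, 1.0 / mu)            # low-rank update
    S = shrink(M - L + Y / mu, lam / mu)         # sparse update
    Y += mu * (M - L - S)                        # dual update

rel_err = np.linalg.norm(L - L0) / np.linalg.norm(L0)
print(rel_err)
```

On this instance the low-rank component is recovered to small relative error despite the gross sparse corruptions, illustrating the robustness claim in the abstract.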

Information Theory
