Statistical inference for high dimensional regression via Constrained Lasso

17 Apr 2017 · Yun Yang

In this paper, we propose a new method for estimating, and constructing confidence intervals for, low-dimensional components in a high-dimensional model. The proposed estimator, called the Constrained Lasso (CLasso) estimator, is obtained by simultaneously solving two estimating equations: one imposing a zero-bias constraint for the low-dimensional parameter, and the other forming an $\ell_1$-penalized procedure for the high-dimensional nuisance parameter. By carefully choosing the zero-bias constraint, the resulting estimator of the low-dimensional parameter is shown to admit an asymptotically normal limit attaining the Cram\'{e}r-Rao lower bound in a semiparametric sense. We propose a tuning-free iterative algorithm for implementing the CLasso. We show that when the algorithm is initialized at the Lasso estimator, the de-sparsified estimator proposed in van de Geer et al. [\emph{Ann. Statist.} {\bf 42} (2014) 1166--1202] is asymptotically equivalent to the first iterate of the algorithm. We analyze the asymptotic properties of the CLasso estimator and show that the algorithm converges globally at a linear rate. We also demonstrate encouraging empirical performance of the CLasso through numerical studies.
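
As a point of reference for the connection noted in the abstract, the sketch below implements the de-sparsified (debiased) Lasso of van de Geer et al. (2014), which the paper shows to be asymptotically equivalent to the first iterate of the CLasso algorithm when initialized at the Lasso. This is not the paper's implementation of CLasso itself: the function name `desparsified_lasso_coordinate`, the scikit-learn-based fits, and the tuning choice `sqrt(log p / n)` for the nodewise regression are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso, LassoCV

def desparsified_lasso_coordinate(X, y, j, lam_nodewise=None):
    """De-sparsified Lasso estimate and standard error for beta_j
    (van de Geer et al., 2014), under a linear model y = X beta + noise."""
    n, p = X.shape
    # Initial Lasso fit for the full coefficient vector.
    beta_hat = LassoCV(cv=5).fit(X, y).coef_
    # Nodewise Lasso: regress X_j on the remaining columns to obtain a
    # score vector z_j that is approximately orthogonal to the nuisance columns.
    X_j = X[:, j]
    X_rest = np.delete(X, j, axis=1)
    lam = lam_nodewise if lam_nodewise is not None else np.sqrt(np.log(p) / n)
    gamma = Lasso(alpha=lam).fit(X_rest, X_j).coef_
    z_j = X_j - X_rest @ gamma
    # One-step bias correction of the Lasso estimate of beta_j.
    resid = y - X @ beta_hat
    b_j = beta_hat[j] + z_j @ resid / (z_j @ X_j)
    # Plug-in standard error under a Gaussian-noise approximation.
    sigma_hat = np.sqrt(resid @ resid / max(n - np.count_nonzero(beta_hat), 1))
    se_j = sigma_hat * np.linalg.norm(z_j) / abs(z_j @ X_j)
    return b_j, se_j

# Usage: 95% confidence interval for the first coefficient on simulated data.
rng = np.random.default_rng(0)
n, p = 200, 500
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [1.0, -2.0, 1.5]
y = X @ beta + rng.standard_normal(n)
b1, se1 = desparsified_lasso_coordinate(X, y, j=0)
print(f"beta_1 estimate: {b1:.3f}, "
      f"95% CI: [{b1 - 1.96 * se1:.3f}, {b1 + 1.96 * se1:.3f}]")
```

The zero-bias constraint in the CLasso plays the role that the nodewise-regression score `z_j` plays above: it removes the first-order bias that regularization of the nuisance parameter induces on the coordinate of interest.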
