SCAD-penalized regression in high-dimensional partially linear models

31 Mar 2009 · Huiliang Xie, Jian Huang

We consider the problem of simultaneous variable selection and estimation in partially linear models with a divergent number of covariates in the linear part, under the assumption that the vector of regression coefficients is sparse. We apply the SCAD penalty to achieve sparsity in the linear part and use polynomial splines to estimate the nonparametric component. Under reasonable conditions, it is shown that consistency in terms of variable selection and estimation can be achieved simultaneously for the linear and nonparametric components. Furthermore, the SCAD-penalized estimators of the nonzero coefficients are shown to have the asymptotic oracle property, in the sense that they are asymptotically normal with the same means and covariances they would have if the zero coefficients were known in advance. The finite sample behavior of the SCAD-penalized estimators is evaluated in simulation studies and illustrated with a data set.
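
The paper itself provides no code, so the following is only a minimal, illustrative sketch of the ingredients named in the abstract: the SCAD penalty and its univariate thresholding rule (Fan and Li, 2001, with the conventional a = 3.7), a truncated-power polynomial spline basis, and a coordinate-descent fit of the linear part after profiling out the nonparametric component. Names such as `fit_plm_scad`, the choice of spline basis, and the solver are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty (Fan & Li, 2001), applied elementwise; a = 3.7 is the usual default."""
    t = np.abs(theta)
    return np.where(
        t <= lam,
        lam * t,
        np.where(t <= a * lam,
                 -(t ** 2 - 2 * a * lam * t + lam ** 2) / (2 * (a - 1)),
                 (a + 1) * lam ** 2 / 2),
    )

def scad_threshold(z, lam, a=3.7):
    """Closed-form minimizer of 0.5*(z - t)^2 + p_lam(t) (orthonormal-design update)."""
    az = np.abs(z)
    soft = np.sign(z) * np.maximum(az - lam, 0.0)         # |z| <= 2*lam: soft-thresholding
    mid = ((a - 1) * z - np.sign(z) * a * lam) / (a - 2)  # 2*lam < |z| <= a*lam
    return np.where(az <= 2 * lam, soft, np.where(az <= a * lam, mid, z))

def spline_basis(t, knots, degree=3):
    """Truncated-power polynomial spline basis in t (intercept excluded)."""
    cols = [t ** d for d in range(1, degree + 1)]
    cols += [np.maximum(t - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)

def fit_plm_scad(y, X, t, lam, knots, n_iter=200):
    """Sketch only: profile out the nonparametric part g(t) with a spline basis,
    then run coordinate descent with SCAD thresholding on the linear part."""
    n, p = X.shape
    B = np.column_stack([np.ones(n), spline_basis(t, knots)])
    P = B @ np.linalg.pinv(B)                 # projection onto the spline space
    Xr, yr = X - P @ X, y - P @ y             # partial out the nonparametric component
    scale = np.sqrt((Xr ** 2).mean(axis=0))   # standardize so each column has x_j'x_j / n = 1
    Xs = Xr / scale
    beta = np.zeros(p)
    r = yr - Xs @ beta
    for _ in range(n_iter):
        for j in range(p):
            zj = Xs[:, j] @ (r + Xs[:, j] * beta[j]) / n
            new = scad_threshold(zj, lam)
            r += Xs[:, j] * (beta[j] - new)
            beta[j] = new
    beta = beta / scale                       # back to the original X scale
    g_hat = P @ (y - X @ beta)                # spline estimate of the nonparametric part
    return beta, g_hat
```

Because the SCAD penalty is flat beyond a*lam, the thresholding rule leaves large coefficients unshrunk, which is what underlies the oracle property claimed in the abstract; the tuning parameter lam would in practice be chosen by a criterion such as BIC or cross-validation.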


Categories


Statistics Theory · 62J05, 62G08 (Primary); 62E20 (Secondary)