De-Biased Machine Learning of Global and Local Parameters Using Regularized Riesz Representers

23 Feb 2018 · Victor Chernozhukov, Whitney Newey, Rahul Singh

We provide adaptive inference methods, based on $\ell_1$ regularization, for regular (semiparametric) and non-regular (nonparametric) linear functionals of the conditional expectation function. Examples of regular functionals include average treatment effects, policy effects, and derivatives. Examples of non-regular functionals include average treatment effects, policy effects, and derivatives conditional on a covariate subvector fixed at a point. We construct a Neyman orthogonal equation for the target parameter that is approximately invariant to small perturbations of the nuisance parameters. To achieve this property, we include the Riesz representer for the functional as an additional nuisance parameter. Our analysis yields weak "double sparsity robustness": either the approximation to the regression or the approximation to the representer can be "completely dense" as long as the other is sufficiently "sparse". Our main results are non-asymptotic and imply asymptotic uniform validity over large classes of models, translating into honest confidence bands for both global and local parameters.
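The orthogonal-moment construction above can be illustrated for the average treatment effect, a regular functional. The sketch below is a simplified stand-in, not the paper's exact procedure: it uses off-the-shelf $\ell_1$-penalized fits from scikit-learn for the regression and for a propensity model (from which the ATE Riesz representer $\alpha(d,x) = d/e(x) - (1-d)/(1-e(x))$ is derived), whereas the paper regularizes the representer directly. The data-generating process, tuning constants, and fold count are illustrative choices.

```python
# Hedged sketch: debiased (Neyman-orthogonal) estimation of the ATE,
# a linear functional theta = E[g(1,X) - g(0,X)] of the regression
# g(d,x) = E[Y | D=d, X=x]. Simplified illustration, not the paper's
# exact l1-regularized Riesz-representer estimator.
import numpy as np
from sklearn.linear_model import Lasso, LogisticRegression
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n, p = 2000, 20
X = rng.normal(size=(n, p))
# Sparse design: propensity depends on the first covariate only.
e = 1.0 / (1.0 + np.exp(-X[:, 0]))
D = rng.binomial(1, e)
theta_true = 2.0  # true ATE in this synthetic model
Y = theta_true * D + X[:, 1] + rng.normal(size=n)

W = np.column_stack([D, X])
scores = np.empty(n)
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    # Regression nuisance g, fit by lasso on (D, X).
    g = Lasso(alpha=0.01).fit(W[train], Y[train])
    # Riesz representer for the ATE, obtained here from an
    # l1-penalized propensity fit (a proxy for direct regularization
    # of the representer).
    ps = LogisticRegression(penalty="l1", C=1.0, solver="liblinear")
    ps.fit(X[train], D[train])
    e_hat = np.clip(ps.predict_proba(X[test])[:, 1], 0.01, 0.99)
    alpha = D[test] / e_hat - (1 - D[test]) / (1 - e_hat)
    W1 = np.column_stack([np.ones(len(test)), X[test]])
    W0 = np.column_stack([np.zeros(len(test)), X[test]])
    # Orthogonal score: plug-in term plus representer-weighted residual;
    # first-order insensitive to errors in either nuisance.
    scores[test] = (g.predict(W1) - g.predict(W0)
                    + alpha * (Y[test] - g.predict(W[test])))

theta_hat = scores.mean()
se = scores.std(ddof=1) / np.sqrt(n)
print(f"ATE estimate: {theta_hat:.3f} (SE {se:.3f})")
```

The cross-fitting loop keeps the nuisance fits independent of the scores they enter, and the representer-weighted residual term is what delivers the double-robustness property described in the abstract.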
