A Proof of Orthogonal Double Machine Learning with $Z$-Estimators

12 Apr 2017 · Vasilis Syrgkanis

We consider two-stage estimation with a non-parametric first stage and a generalized method of moments second stage, in a simpler setting than (Chernozhukov et al. 2016). We give an alternative proof of the theorem in (Chernozhukov et al. 2016) that orthogonal second-stage moments, sample splitting, and $n^{1/4}$-consistency of the first stage imply $\sqrt{n}$-consistency and asymptotic normality of the second-stage estimates. Our proof is for a variant of their estimator that is based on the empirical version of the moment condition (Z-estimator), rather than on minimization of a norm of the empirical vector of moments (M-estimator). This note is meant primarily for expository purposes, rather than as a new technical contribution.

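To make the construction concrete, the sketch below (in Python, using numpy and scikit-learn) illustrates the two-stage procedure described in the abstract, specialized to the partially linear model Y = theta*T + g(X) + eps with T = m(X) + eta. The choice of model, the random-forest nuisance learners, the two-fold split, and the helper name dml_plm_theta are illustrative assumptions rather than the paper's general Z-estimator setup: the first stage estimates the nuisances E[T|X] and E[Y|X] non-parametrically on held-out folds (sample splitting), and the second stage solves the empirical orthogonal moment condition directly for theta (a Z-estimator) instead of minimizing a norm of the empirical moments.

    # Minimal, illustrative sketch (not the paper's general estimator) of
    # orthogonal double ML with sample splitting, specialized to the
    # partially linear model  Y = theta*T + g(X) + eps,  T = m(X) + eta.
    # Nuisance learners and fold count are arbitrary choices for illustration.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def dml_plm_theta(X, T, Y, n_folds=2, seed=0):
        """Cross-fitted Z-estimator for theta in the partially linear model."""
        n = len(Y)
        folds = np.random.default_rng(seed).permutation(n) % n_folds
        T_res, Y_res = np.empty(n), np.empty(n)
        for k in range(n_folds):
            train, test = folds != k, folds == k
            # First stage (non-parametric): estimate E[T|X] and E[Y|X] on the
            # complementary fold, so each residual uses out-of-fold nuisances.
            m_hat = RandomForestRegressor(random_state=seed).fit(X[train], T[train])
            q_hat = RandomForestRegressor(random_state=seed).fit(X[train], Y[train])
            T_res[test] = T[test] - m_hat.predict(X[test])
            Y_res[test] = Y[test] - q_hat.predict(X[test])
        # Second stage: set the empirical orthogonal moment to zero,
        #   (1/n) sum_i (Y_res_i - theta * T_res_i) * T_res_i = 0,
        # and solve for theta directly (Z-estimator), rather than minimizing
        # a norm of the empirical moment vector (M-estimator).
        return np.sum(T_res * Y_res) / np.sum(T_res ** 2)

    # Hypothetical usage on simulated data with true theta = 1.5:
    # n, d = 2000, 5
    # X = np.random.randn(n, d)
    # T = X[:, 0] + np.random.randn(n)
    # Y = 1.5 * T + np.sin(X[:, 0]) + np.random.randn(n)
    # print(dml_plm_theta(X, T, Y))  # close to 1.5 for large n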
