no code implementations • 14 Apr 2024 • Jin-Hong Du, Zhenghao Zeng, Edward H. Kennedy, Larry Wasserman, Kathryn Roeder
In this paper, we propose a generic semiparametric inference framework for doubly robust estimation with multiple derived outcomes, which also encompasses the usual multiple-outcome setting in which each unit's response is directly observed.
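As a hypothetical single-outcome illustration (not the paper's multi-outcome framework), a standard doubly robust estimator is the augmented inverse-propensity-weighted (AIPW) mean, which combines an outcome regression with propensity weighting and remains consistent if either nuisance model is correctly specified; all names below are illustrative:

```python
import numpy as np

def aipw_treated_mean(y, a, pi_hat, mu1_hat):
    """AIPW estimate of E[Y(1)].

    y: observed outcomes; a: binary treatment indicators;
    pi_hat: estimated propensity scores P(A=1 | X);
    mu1_hat: estimated outcome regression E[Y | A=1, X].
    The estimate is consistent if either pi_hat or mu1_hat is correct.
    """
    y, a = np.asarray(y, float), np.asarray(a, float)
    return np.mean(mu1_hat + a * (y - mu1_hat) / pi_hat)
```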
1 code implementation • 1 Apr 2024 • Pratik Patil, Jin-Hong Du, Ryan J. Tibshirani
We study the behavior of optimal ridge regularization and optimal ridge risk for out-of-distribution prediction, where the test distribution deviates arbitrarily from the training distribution.
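A minimal sketch of the practical side of this setup (an assumed grid-search procedure, not the paper's theoretical characterization of the optimum): fit ridge on training data and select the penalty that minimizes squared error on data drawn from the shifted test distribution.

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge estimator with the Gram matrix normalized by sample size."""
    n, p = X.shape
    return np.linalg.solve(X.T @ X / n + lam * np.eye(p), X.T @ y / n)

def best_lambda(X_tr, y_tr, X_te, y_te, grid):
    """Penalty in `grid` minimizing squared prediction error on test data,
    which may come from a distribution different from the training one."""
    risks = [np.mean((y_te - X_te @ ridge(X_tr, y_tr, lam)) ** 2)
             for lam in grid]
    return grid[int(np.argmin(risks))]
```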
1 code implementation • 2 Oct 2023 • Pierre C. Bellec, Jin-Hong Du, Takuya Koriyama, Pratik Patil, Kai Tan
We provide a non-asymptotic analysis of the CGCV and the two intermediate risk estimators for ensembles of convex penalized estimators under Gaussian features and a linear response model.
1 code implementation • 13 Sep 2023 • Jin-Hong Du, Larry Wasserman, Kathryn Roeder
Tens of thousands of simultaneous hypothesis tests are routinely performed in genomic studies to identify differentially expressed genes.
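A standard baseline in this large-scale testing setting is the Benjamini–Hochberg step-up procedure for false discovery rate control; the sketch below is this classical procedure only, not the method developed in the paper:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure.

    Returns a boolean array marking which hypotheses are rejected
    while controlling the false discovery rate at level alpha.
    """
    p = np.asarray(pvals, float)
    m = p.size
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    # Reject the k smallest p-values, where k is the largest index
    # at which the sorted p-value falls below its step-up threshold.
    k = int(np.nonzero(below)[0].max()) + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject
```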
no code implementations • 25 Apr 2023 • Jin-Hong Du, Pratik Patil, Arun Kumar Kuchibhotla
We study subsampling-based ridge ensembles in the proportional asymptotics regime, where the feature size grows proportionally with the sample size such that their ratio converges to a constant.
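For concreteness, a minimal sketch of a subsampling-based ridge ensemble (an illustrative implementation, not the paper's asymptotic analysis): fit ridge on M random subsamples of size k drawn without replacement and average the coefficient estimates.

```python
import numpy as np

def subsample_ridge_ensemble(X, y, lam, k, M, seed=0):
    """Average of M ridge estimators, each fit on a random
    subsample of size k drawn without replacement."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    betas = []
    for _ in range(M):
        idx = rng.choice(n, size=k, replace=False)
        Xs, ys = X[idx], y[idx]
        betas.append(np.linalg.solve(Xs.T @ Xs + lam * np.eye(p), Xs.T @ ys))
    return np.mean(betas, axis=0)
```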
no code implementations • 27 Feb 2023 • Jin-Hong Du, Pratik Patil, Kathryn Roeder, Arun Kumar Kuchibhotla
By establishing uniform consistency of our risk extrapolation technique over ensemble and subsample sizes, we show that ECV yields $\delta$-optimal (with respect to the oracle-tuned risk) ensembles for squared prediction risk.
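The extrapolation idea can be sketched under a simplifying assumption: if the squared prediction risk of an M-ensemble decays as $R_M = R_\infty + (R_1 - R_\infty)/M$, then risk estimates at ensemble sizes 1 and 2 determine the risk at every other size. This is a toy illustration of extrapolation in the ensemble size only, not the paper's ECV procedure:

```python
def extrapolate_risk(r1, r2, M):
    """Extrapolate the risk of an M-ensemble from estimated risks of
    1- and 2-ensembles, assuming R_M = R_inf + (R_1 - R_inf) / M.
    Solving the two equations gives R_inf = 2*R_2 - R_1."""
    r_inf = 2.0 * r2 - r1
    return r_inf + (r1 - r_inf) / M
```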
no code implementations • 20 Oct 2022 • Pratik Patil, Jin-Hong Du, Arun Kumar Kuchibhotla
Bagging is a commonly used ensemble technique in statistics and machine learning to improve the performance of prediction procedures.
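A generic sketch of bagging (bootstrap aggregating) for an arbitrary base procedure, shown here with a least-squares learner; the function names and interface are illustrative, not taken from the paper:

```python
import numpy as np

def bagged_predict(X, y, X_new, fit, predict, B=50, seed=0):
    """Bagging: fit the base procedure on B bootstrap resamples
    (drawn with replacement) and average their predictions on X_new."""
    rng = np.random.default_rng(seed)
    n = len(y)
    preds = []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)  # bootstrap resample
        model = fit(X[idx], y[idx])
        preds.append(predict(model, X_new))
    return np.mean(preds, axis=0)
```

For example, with `fit = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]` and `predict = lambda b, X: X @ b`, this bags ordinary least squares.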