Search Results for author: Jin-Hong Du

Found 7 papers, 3 papers with code

Causal Inference for Genomic Data with Multiple Heterogeneous Outcomes

no code implementations • 14 Apr 2024 • Jin-Hong Du, Zhenghao Zeng, Edward H. Kennedy, Larry Wasserman, Kathryn Roeder

In this paper, we propose a generic semiparametric inference framework for doubly robust estimation with multiple derived outcomes, which also encompasses the usual setting of multiple outcomes when the response of each unit is available.

Causal Inference
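
For orientation, a minimal sketch of a standard doubly robust (AIPW) estimator of an average treatment effect for a single outcome is shown below; the paper generalizes this kind of estimator to many derived outcomes, and the function and variable names here are illustrative rather than taken from the paper's code.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def aipw_ate(X, A, Y):
    """Doubly robust (AIPW) estimate of the average treatment effect for a
    single outcome Y, binary treatment A, and covariates X.
    Illustrative sketch only; the paper handles multiple derived outcomes."""
    # Propensity model P(A=1 | X)
    pi = LogisticRegression(max_iter=1000).fit(X, A).predict_proba(X)[:, 1]
    # Outcome models E[Y | X, A=a], fit separately on treated and controls
    mu1 = LinearRegression().fit(X[A == 1], Y[A == 1]).predict(X)
    mu0 = LinearRegression().fit(X[A == 0], Y[A == 0]).predict(X)
    # AIPW pseudo-outcomes combine the two models; the estimator stays
    # consistent if either the propensity or the outcome model is correct
    psi = (mu1 - mu0
           + A * (Y - mu1) / pi
           - (1 - A) * (Y - mu0) / (1 - pi))
    return psi.mean(), psi.std(ddof=1) / np.sqrt(len(Y))
```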

Optimal Ridge Regularization for Out-of-Distribution Prediction

1 code implementation • 1 Apr 2024 • Pratik Patil, Jin-Hong Du, Ryan J. Tibshirani

We study the behavior of optimal ridge regularization and optimal ridge risk for out-of-distribution prediction, where the test distribution deviates arbitrarily from the train distribution.

regression
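
As a simple empirical counterpart to the question the paper studies, one can sweep the ridge penalty and pick the value minimizing risk on data drawn from a shifted test distribution; the sketch below assumes a synthetic covariate-shift setup and is not the paper's analysis.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n, p = 200, 50
beta = rng.normal(size=p) / np.sqrt(p)

# Train data from one distribution, test data from a shifted covariate distribution
X_train = rng.normal(size=(n, p))
y_train = X_train @ beta + rng.normal(size=n)
X_test = 1.5 * rng.normal(size=(1000, p))   # covariate shift: inflated test covariance
y_test = X_test @ beta + rng.normal(size=1000)

# Sweep the ridge penalty and pick the value minimizing out-of-distribution risk
lambdas = np.logspace(-3, 3, 25)
risks = [np.mean((Ridge(alpha=lam).fit(X_train, y_train).predict(X_test) - y_test) ** 2)
         for lam in lambdas]
print("OOD-optimal lambda:", lambdas[int(np.argmin(risks))])
```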

Corrected generalized cross-validation for finite ensembles of penalized estimators

1 code implementation • 2 Oct 2023 • Pierre C. Bellec, Jin-Hong Du, Takuya Koriyama, Pratik Patil, Kai Tan

We provide a non-asymptotic analysis of the CGCV and the two intermediate risk estimators for ensembles of convex penalized estimators under Gaussian features and a linear response model.
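
For reference, the classical (uncorrected) generalized cross-validation criterion for a single ridge estimator is sketched below; the paper's CGCV corrects GCV for finite ensembles of penalized estimators, which this sketch does not do.

```python
import numpy as np

def gcv_ridge(X, y, lam):
    """Classical generalized cross-validation for a single ridge estimator:
    GCV(lam) = mean squared residual / (1 - tr(S)/n)^2, where S is the
    ridge smoothing matrix. (Penalty parameterized as n * lam here.)"""
    n, p = X.shape
    G = X.T @ X + n * lam * np.eye(p)
    S = X @ np.linalg.solve(G, X.T)    # smoothing matrix S = X (X'X + n*lam I)^{-1} X'
    y_hat = S @ y
    df = np.trace(S)                   # effective degrees of freedom
    return np.mean((y - y_hat) ** 2) / (1 - df / n) ** 2
```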

Simultaneous inference for generalized linear models with unmeasured confounders

1 code implementation • 13 Sep 2023 • Jin-Hong Du, Larry Wasserman, Kathryn Roeder

Tens of thousands of simultaneous hypothesis tests are routinely performed in genomic studies to identify differentially expressed genes.
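
A naive baseline for this kind of analysis, ignoring the unmeasured confounding that the paper addresses, is to fit one GLM per gene and adjust the resulting p-values across genes; the sketch below uses a Poisson GLM and Benjamini–Hochberg adjustment as an illustrative example.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

def naive_de_tests(Y, design):
    """Fit one Poisson GLM per gene (columns of Y) and BH-adjust the p-values
    for the condition effect in `design`. Naive baseline only: it ignores the
    unmeasured confounders that the paper's method accounts for."""
    pvals = []
    for g in range(Y.shape[1]):
        fit = sm.GLM(Y[:, g], sm.add_constant(design),
                     family=sm.families.Poisson()).fit()
        pvals.append(fit.pvalues[1])   # p-value for the condition coefficient
    reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
    return reject, qvals
```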

Subsample Ridge Ensembles: Equivalences and Generalized Cross-Validation

no code implementations • 25 Apr 2023 • Jin-Hong Du, Pratik Patil, Arun Kumar Kuchibhotla

We study subsampling-based ridge ensembles in the proportional asymptotics regime, where the feature size grows proportionally with the sample size such that their ratio converges to a constant.
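
A minimal sketch of the kind of estimator studied here is given below: fit ridge estimators on random subsamples and average their predictions. Names and defaults are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge

def subsample_ridge_ensemble(X, y, X_new, lam=1.0, k=100, n_estimators=20, seed=0):
    """Average predictions of ridge estimators fit on random subsamples of size k.
    Illustrative sketch of a subsample ridge ensemble."""
    rng = np.random.default_rng(seed)
    preds = np.zeros(len(X_new))
    for _ in range(n_estimators):
        idx = rng.choice(len(X), size=k, replace=False)   # subsample without replacement
        preds += Ridge(alpha=lam).fit(X[idx], y[idx]).predict(X_new)
    return preds / n_estimators
```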

Extrapolated cross-validation for randomized ensembles

no code implementations • 27 Feb 2023 • Jin-Hong Du, Pratik Patil, Kathryn Roeder, Arun Kumar Kuchibhotla

By establishing uniform consistency of our risk extrapolation technique over ensemble and subsample sizes, we show that ECV yields $\delta$-optimal (with respect to the oracle-tuned risk) ensembles for squared prediction risk.
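
The extrapolation step can be sketched as follows, under the assumption that the squared prediction risk of an M-estimator ensemble decomposes as a + b/M in the ensemble size, so that risk estimates at sizes 1 and 2 determine the whole curve; the inputs r1 and r2 would come from out-of-bag-style estimates, which this sketch omits.

```python
def extrapolate_risk(r1, r2, M):
    """Extrapolate ensemble risk to size M from risk estimates at sizes 1 and 2,
    assuming the risk decomposes as a + b / M in the ensemble size M.
    Sketch of the extrapolation step only."""
    r_inf = 2.0 * r2 - r1   # a: limiting risk of the infinite ensemble
    b = r1 - r_inf          # b: finite-ensemble correction
    return r_inf + b / M
```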

Bagging in overparameterized learning: Risk characterization and risk monotonization

no code implementations • 20 Oct 2022 • Pratik Patil, Jin-Hong Du, Arun Kumar Kuchibhotla

Bagging is a commonly used ensemble technique in statistics and machine learning to improve the performance of prediction procedures.
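
A generic sketch of the bagging procedure itself is shown below (bootstrap resample, fit a base learner, average predictions); the base learner and names are arbitrary choices for illustration, not the setting analyzed in the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagged_predict(X, y, X_new, n_bags=50, seed=0):
    """Bagging: fit base learners on bootstrap resamples and average predictions.
    Generic sketch of the technique; base learner chosen arbitrarily."""
    rng = np.random.default_rng(seed)
    preds = np.zeros(len(X_new))
    for _ in range(n_bags):
        idx = rng.integers(0, len(X), size=len(X))   # bootstrap resample with replacement
        preds += DecisionTreeRegressor(random_state=0).fit(X[idx], y[idx]).predict(X_new)
    return preds / n_bags
```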
