Search Results for author: Hong Hu

Found 7 papers, 2 papers with code

Asymptotics of Random Feature Regression Beyond the Linear Scaling Regime

no code implementations 13 Mar 2024 Hong Hu, Yue M. Lu, Theodor Misiakiewicz

On the other hand, if $p = o(n)$, the number of random features $p$ is the limiting factor, and the random feature ridge regression (RFRR) test error matches the approximation error of the random feature model class (akin to taking $n = \infty$).

Regression
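
To make the object of study concrete, here is a minimal numpy sketch of random feature ridge regression (RFRR): ridge regression on $p$ random ReLU features of $d$-dimensional inputs. The feature map, the toy target, and every name below are illustrative assumptions, not code from the paper.

```python
import numpy as np

def rfrr_test_error(n=2000, p=500, d=50, lam=1e-3, n_test=1000, seed=0):
    """Toy random feature ridge regression (RFRR): ridge on p random
    ReLU features of d-dimensional Gaussian inputs. Illustrative only."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((p, d)) / np.sqrt(d)   # frozen random first-layer weights

    def features(X):
        return np.maximum(X @ W.T, 0.0)            # ReLU random features, shape (n, p)

    f_star = lambda X: np.tanh(X[:, 0])            # toy target function (an assumption)
    X, X_te = rng.standard_normal((n, d)), rng.standard_normal((n_test, d))
    y = f_star(X) + 0.1 * rng.standard_normal(n)   # noisy training labels

    Z = features(X)
    # Ridge in feature space: a = (Z^T Z + n*lam*I)^{-1} Z^T y
    a = np.linalg.solve(Z.T @ Z + n * lam * np.eye(p), Z.T @ y)
    return np.mean((features(X_te) @ a - f_star(X_te)) ** 2)

# With p = o(n) (first call), the feature count is the bottleneck and the
# error saturates at the approximation error of the random feature class.
print(rfrr_test_error(n=4000, p=200), rfrr_test_error(n=4000, p=2000))
```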

Homophily modulates double descent generalization in graph convolution networks

1 code implementation 26 Dec 2022 Cheng Shi, Liming Pan, Hong Hu, Ivan Dokmanić

Motivated by experimental observations of "transductive" double descent in key networks and datasets, we use analytical tools from statistical physics and random matrix theory to precisely characterize generalization in simple graph convolution networks on the contextual stochastic block model.

Graph Learning, Learning Theory, +1
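
The setting here is graph convolution on the contextual stochastic block model (CSBM). The sketch below, an independent illustration and not the authors' released code, generates a toy CSBM, where the gap between the within-community and cross-community edge probabilities sets the homophily level, and applies one degree-normalized graph-convolution step.

```python
import numpy as np

def csbm(n=400, d=50, p_in=0.08, p_out=0.02, mu=1.0, seed=0):
    """Toy contextual stochastic block model: two communities, edge
    probability p_in within vs p_out across (p_in > p_out = homophily),
    plus Gaussian node features shifted along a community direction."""
    rng = np.random.default_rng(seed)
    y = rng.integers(0, 2, n) * 2 - 1                 # labels in {-1, +1}
    P = np.where(np.equal.outer(y, y), p_in, p_out)   # edge probabilities
    A = (rng.random((n, n)) < P).astype(float)
    A = np.triu(A, 1); A = A + A.T                    # symmetric, no self-loops
    u = rng.standard_normal(d) / np.sqrt(d)           # community direction
    X = mu * np.outer(y, u) + rng.standard_normal((n, d)) / np.sqrt(d)
    return A, X, y

def graph_convolution(A, X):
    """One simple graph-convolution step:
    (D^{-1/2} (A + I) D^{-1/2}) X, i.e. normalized neighborhood averaging."""
    A_hat = A + np.eye(len(A))
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(1))
    return (d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]) @ X

A, X, y = csbm()
H = graph_convolution(A, X)   # convolved features, fed to a linear readout
```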

Precise Learning Curves and Higher-Order Scaling Limits for Dot Product Kernel Regression

no code implementations 30 May 2022 Lechao Xiao, Hong Hu, Theodor Misiakiewicz, Yue M. Lu, Jeffrey Pennington

As modern machine learning models continue to advance the computational frontier, it has become increasingly important to develop precise estimates for expected performance improvements under different model and data scaling regimes.

Regression

Sharp Asymptotics of Kernel Ridge Regression Beyond the Linear Regime

no code implementations 13 May 2022 Hong Hu, Yue M. Lu

The generalization performance of kernel ridge regression (KRR) exhibits a multi-phased pattern that crucially depends on the scaling relationship between the sample size $n$ and the underlying dimension $d$.

Regression
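
As a concrete reference point, here is a minimal numpy kernel ridge regression with an inner-product kernel; the kernel choice, the toy target, and the scaling $n = d^{3/2}$ (between the linear regime $n \asymp d$ and the quadratic regime $n \asymp d^2$) are illustrative assumptions.

```python
import numpy as np

def krr_predict(X, y, X_test, lam=1e-3):
    """Minimal kernel ridge regression with the inner-product kernel
    k(x, x') = exp(<x, x'> / d); the kernel choice is illustrative.
    alpha = (K + n*lam*I)^{-1} y,  f(x) = sum_i alpha_i k(x_i, x)."""
    n, d = X.shape
    K = np.exp(X @ X.T / d)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return np.exp(X_test @ X.T / d) @ alpha

rng = np.random.default_rng(0)
d = 40
n = int(d ** 1.5)                 # sample size scaling relative to dimension
X = rng.standard_normal((n, d))
y = np.tanh(X[:, 0]) + 0.1 * rng.standard_normal(n)   # toy target
preds = krr_predict(X, y, rng.standard_normal((500, d)))
```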

SLOPE for Sparse Linear Regression: Asymptotics and Optimal Regularization

1 code implementation 27 Mar 2019 Hong Hu, Yue M. Lu

In sparse linear regression, the SLOPE estimator generalizes LASSO by penalizing different coordinates of the estimate according to their magnitudes.

Information Theory, Statistics Theory
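
Concretely, SLOPE solves $\min_b \tfrac{1}{2}\|y - Xb\|^2 + \sum_i \lambda_i |b|_{(i)}$, where $\lambda_1 \ge \dots \ge \lambda_p \ge 0$ and $|b|_{(1)} \ge \dots \ge |b|_{(p)}$ are the sorted magnitudes, so larger coordinates receive larger penalties. The sketch below runs SLOPE by proximal gradient descent with the standard sorted-L1 prox (sort, shift, pool adjacent violators, clip); it is an independent illustration, not the paper's released code, and the decreasing $\lambda$ sequence is a toy choice.

```python
import numpy as np

def pav_nonincreasing(z):
    """Isotonic projection onto nonincreasing sequences (pool adjacent
    violators): merge and average adjacent blocks that would increase."""
    vals, counts = [], []
    for v in z:
        vals.append(v); counts.append(1)
        while len(vals) > 1 and vals[-2] < vals[-1]:
            total = vals[-1] * counts[-1] + vals[-2] * counts[-2]
            count = counts[-1] + counts[-2]
            vals[-2:], counts[-2:] = [total / count], [count]
    return np.repeat(vals, counts)

def prox_slope(b, lam):
    """Prox of the sorted-L1 (SLOPE) penalty sum_i lam_i * |b|_(i) with
    lam decreasing: sort |b| decreasingly, subtract lam, project onto
    the nonincreasing cone, clip at zero, then undo the sort and signs."""
    order = np.argsort(-np.abs(b))
    w = np.maximum(pav_nonincreasing(np.abs(b)[order] - lam), 0.0)
    out = np.empty_like(b)
    out[order] = w
    return np.sign(b) * out

def slope(X, y, lam, n_iter=500):
    """SLOPE via proximal gradient (ISTA) on
    0.5 * ||y - X b||^2 + sum_i lam_i * |b|_(i)."""
    L = np.linalg.norm(X, 2) ** 2     # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        b = prox_slope(b - X.T @ (X @ b - y) / L, lam / L)
    return b

rng = np.random.default_rng(0)
n, p, k = 200, 400, 10
X = rng.standard_normal((n, p)) / np.sqrt(n)
beta = np.zeros(p); beta[:k] = 5.0                # sparse ground truth
y = X @ beta + 0.1 * rng.standard_normal(n)
lam = 0.1 * np.linspace(1.0, 0.2, p)              # toy decreasing sequence
b_hat = slope(X, y, lam)
```

With all $\lambda_i$ equal, the prox reduces to ordinary soft-thresholding and SLOPE recovers LASSO.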

A Solvable High-Dimensional Model of GAN

no code implementations NeurIPS 2019 Chuang Wang, Hong Hu, Yue M. Lu

We present a theoretical analysis of the training process for a single-layer GAN fed by high-dimensional input data.

Vocal Bursts Intensity Prediction
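
As a rough picture of the kind of training process such an analysis tracks, here is a minimal alternating online-SGD loop for a single-layer GAN (linear generator, single-neuron discriminator) on high-dimensional data; the data model, architecture, and step sizes are generic assumptions, not the specific model from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, steps, lr = 100, 10, 5000, 0.05
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

# Real data: Gaussian with low-rank structure (an illustrative choice).
U = rng.standard_normal((d, k)) / np.sqrt(d)
sample_real = lambda m: (rng.standard_normal((m, k)) @ U.T
                         + 0.1 * rng.standard_normal((m, d)))

W = rng.standard_normal((d, k)) / np.sqrt(d)   # generator G(z) = W z
v = rng.standard_normal(d) / np.sqrt(d)        # discriminator D(x) = sigmoid(v.x)

for _ in range(steps):
    x_real, z = sample_real(1), rng.standard_normal((1, k))
    x_fake = z @ W.T
    # Discriminator ascent on log D(x_real) + log(1 - D(x_fake));
    # O(1/d) step sizes are a common scaling in high-dimensional analyses.
    g_v = (1 - sigmoid(x_real @ v)) * x_real - sigmoid(x_fake @ v) * x_fake
    v += (lr / d) * g_v.ravel()
    # Generator descent on log(1 - D(G(z)))
    g_W = sigmoid(x_fake @ v) * np.outer(v, z.ravel())
    W += (lr / d) * g_W
```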
