
Goodness-of-fit tests on manifolds

We develop a general theory of goodness-of-fit tests for non-linear models. In particular, we assume that the observations are noisy samples of a submanifold defined by a sufficiently smooth non-linear map, with additive Gaussian observation noise. Our main result shows that the "residual" of the model fit, obtained by solving a non-linear least-squares problem, follows a (possibly noncentral) $\chi^2$ distribution whose parameters are determined by the model order and the dimension of the problem. We further present a method for selecting the model order sequentially. We demonstrate the broad applicability of the theory in machine learning and signal processing, including determining the rank of low-rank (possibly complex-valued) matrices and tensors from noisy, partial, or indirect observations, determining the number of sources in signal demixing, and potential applications in determining the number of hidden nodes in neural networks.
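To make the sequential model-order selection concrete, the following is a minimal, hypothetical sketch for the low-rank matrix case: the best rank-k fit is the truncated SVD (a least-squares fit), its scaled residual is compared against a $\chi^2$ quantile whose degrees of freedom equal the number of observations minus the dimension of the rank-k model, and k is increased until the residual is consistent with noise. The function name `select_rank`, the significance level `alpha`, and the exact degrees-of-freedom formula are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of sequential rank selection via a chi-square residual test.
import numpy as np
from scipy.stats import chi2

def select_rank(Y, sigma, alpha=0.05, max_rank=None):
    """Increase candidate rank k until the scaled residual of the best
    rank-k least-squares fit is no longer rejected by a chi^2 test."""
    m, n = Y.shape
    max_rank = max_rank or min(m, n) - 1
    _, s, _ = np.linalg.svd(Y, full_matrices=False)
    for k in range(1, max_rank + 1):
        # Residual of the best rank-k fit (truncated SVD = least squares).
        rss = np.sum(s[k:] ** 2) / sigma ** 2
        # Assumed degrees of freedom: observations minus rank-k model dimension.
        dof = m * n - k * (m + n - k)
        if rss <= chi2.ppf(1 - alpha, dof):  # residual consistent with pure noise
            return k
    return max_rank

# Usage: rank-3 ground truth observed with additive Gaussian noise.
rng = np.random.default_rng(0)
m, n, r, sigma = 60, 50, 3, 0.1
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
Y = A + sigma * rng.standard_normal((m, n))
print(select_rank(Y, sigma))  # typically recovers 3
```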
