Search Results for author: Wenjing Liao

Found 19 papers, 2 papers with code

Generalization Error Guaranteed Auto-Encoder-Based Nonlinear Model Reduction for Operator Learning

no code implementations • 19 Jan 2024 • Hao Liu, Biraj Dahal, Rongjie Lai, Wenjing Liao

The problem of operator learning, in this context, seeks to extract these physical processes from empirical data, which is challenging due to the infinite or high dimensionality of data.

Operator learning

Effective Minkowski Dimension of Deep Nonparametric Regression: Function Approximation and Statistical Theories

no code implementations • 26 Jun 2023 • Zixuan Zhang, Minshuo Chen, Mengdi Wang, Wenjing Liao, Tuo Zhao

Existing theories on deep nonparametric regression have shown that when the input data lie on a low-dimensional manifold, deep neural networks can adapt to the intrinsic data structures.

regression

Deep Nonparametric Estimation of Intrinsic Data Structures by Chart Autoencoders: Generalization Error and Robustness

no code implementations • 17 Mar 2023 • Hao Liu, Alex Havrilla, Rongjie Lai, Wenjing Liao

Our paper establishes statistical guarantees on the generalization error of chart autoencoders, and we demonstrate their denoising capabilities by considering $n$ noisy training samples, along with their noise-free counterparts, on a $d$-dimensional manifold.

Denoising

On Deep Generative Models for Approximation and Estimation of Distributions on Manifolds

no code implementations • 25 Feb 2023 • Biraj Dahal, Alex Havrilla, Minshuo Chen, Tuo Zhao, Wenjing Liao

Many existing experiments have demonstrated that generative networks can generate high-dimensional complex data from a low-dimensional easy-to-sample distribution.

WeakIdent: Weak formulation for Identifying Differential Equations using Narrow-fit and Trimming

1 code implementation • 6 Nov 2022 • Mengyi Tang, Wenjing Liao, Rachel Kuske, Sung Ha Kang

We propose a general and robust framework to recover differential equations using a weak formulation, for both ordinary and partial differential equations (ODEs and PDEs).

Denoising
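As a toy illustration of the weak-formulation idea, the sketch below is a minimal pure-Python analogue, not the WeakIdent code: the ODE u' = -2u, the two-term candidate library {u, u^2}, the sin^2 test-function windows, and the trimming threshold are all illustrative choices. Multiplying the ODE by a compactly supported test function and integrating by parts removes the need to differentiate noisy data; a least-squares solve over the library followed by trimming of small coefficients recovers the model.

```python
import math

# Sample a trajectory of the ODE u' = -2*u, so the true library
# coefficients are a = -2 on the term u and b = 0 on the term u^2.
n, T = 401, 2.0
t = [T * i / (n - 1) for i in range(n)]
u = [math.exp(-2.0 * s) for s in t]
dt = t[1] - t[0]

def trapz(f):
    # composite trapezoid rule on the uniform grid
    return dt * (sum(f) - 0.5 * (f[0] + f[-1]))

# Test functions phi_k(s) = sin^2(pi*(s - a_k)/w) supported on
# windows [a_k, a_k + w]; phi_k vanishes at the window ends, so
# integration by parts has no boundary terms:
#   -int u phi' = a * int u phi + b * int u^2 phi
w = 0.5
rows, rhs = [], []
for k in range(12):
    a_k = (T - w) * k / 11
    phi = [math.sin(math.pi * (s - a_k) / w) ** 2
           if a_k <= s <= a_k + w else 0.0 for s in t]
    dphi = [(math.pi / w) * math.sin(2 * math.pi * (s - a_k) / w)
            if a_k <= s <= a_k + w else 0.0 for s in t]
    rhs.append(-trapz([ui * dp for ui, dp in zip(u, dphi)]))
    rows.append([trapz([ui * p for ui, p in zip(u, phi)]),
                 trapz([ui * ui * p for ui, p in zip(u, phi)])])

# Least squares over the two library terms via 2x2 normal equations.
g11 = sum(r[0] * r[0] for r in rows)
g12 = sum(r[0] * r[1] for r in rows)
g22 = sum(r[1] * r[1] for r in rows)
b1 = sum(r[0] * y for r, y in zip(rows, rhs))
b2 = sum(r[1] * y for r, y in zip(rows, rhs))
det = g11 * g22 - g12 * g12
a_hat = (g22 * b1 - g12 * b2) / det
b_hat = (g11 * b2 - g12 * b1) / det

# "Trimming": discard library terms with small coefficients.
coeffs = {"u": a_hat, "u^2": b_hat}
model = {name: c for name, c in coeffs.items() if abs(c) > 0.5}
print(model)  # the u term survives with coefficient near -2
```

Because the test functions are only integrated against the data, this scheme tolerates noise far better than pointwise differentiation; the paper's narrow-fit and trimming steps refine this basic recipe.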

Benefits of Overparameterized Convolutional Residual Networks: Function Approximation under Smoothness Constraint

no code implementations • 9 Jun 2022 • Hao Liu, Minshuo Chen, Siawpeng Er, Wenjing Liao, Tong Zhang, Tuo Zhao

Overparameterized neural networks enjoy great representation power on complex data, and more importantly yield sufficiently smooth output, which is crucial to their generalization and robustness.

Image Classification

A Manifold Two-Sample Test Study: Integral Probability Metric with Neural Networks

no code implementations • 4 May 2022 • Jie Wang, Minshuo Chen, Tuo Zhao, Wenjing Liao, Yao Xie

Based on the approximation theory of neural networks, we show that the neural network IPM test has type-II risk in the order of $n^{-(s+\beta)/d}$, matching the order of the type-II risk of the Hölder IPM test.

Deep Nonparametric Estimation of Operators between Infinite Dimensional Spaces

no code implementations • 1 Jan 2022 • Hao Liu, Haizhao Yang, Minshuo Chen, Tuo Zhao, Wenjing Liao

Learning operators between infinite-dimensional spaces is an important task with wide applications in machine learning, imaging science, mathematical modeling, and simulation.

Besov Function Approximation and Binary Classification on Low-Dimensional Manifolds Using Convolutional Residual Networks

no code implementations • 7 Sep 2021 • Hao Liu, Minshuo Chen, Tuo Zhao, Wenjing Liao

Most existing statistical theories on deep neural networks have sample complexities cursed by the data dimension, and therefore cannot explain the empirical success of deep learning on high-dimensional data.

Binary Classification

Multiscale regression on unknown manifolds

no code implementations • 13 Jan 2021 • Wenjing Liao, Mauro Maggioni, Stefano Vigogna

We consider the regression problem of estimating functions on $\mathbb{R}^D$ but supported on a $d$-dimensional manifold $ \mathcal{M} \subset \mathbb{R}^D $ with $ d \ll D $.

regression
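A hedged single-scale analogue of this setting (not the paper's multiscale estimator): data lie on a one-dimensional curve embedded in a higher-dimensional ambient space, and a simple k-nearest-neighbor local-averaging regressor already achieves small error because its bandwidth trade-off is governed by the intrinsic dimension $d$, not the ambient $D$. The embedding, target function, and choice of $k$ below are illustrative.

```python
import math, random

random.seed(1)
D, n = 10, 500  # ambient dimension D = 10, intrinsic dimension d = 1

def embed(th):
    # a circle embedded in R^D: only the first two coordinates vary
    x = [0.0] * D
    x[0], x[1] = math.cos(th), math.sin(th)
    return x

# Training set: points on the curve with a response depending only on
# position along the curve.
train = []
for _ in range(n):
    th = random.uniform(0, 2 * math.pi)
    train.append((embed(th), math.sin(3 * th)))

def predict(x, k=15):
    # k-NN local averaging; neighbors are effectively neighbors along
    # the curve, so the rate depends on d = 1 rather than D = 10
    order = sorted(range(n),
                   key=lambda i: sum((x[j] - train[i][0][j]) ** 2
                                     for j in range(D)))
    return sum(train[i][1] for i in order[:k]) / k

# Evaluate on fresh points from the same manifold.
mse = 0.0
for _ in range(100):
    th = random.uniform(0, 2 * math.pi)
    mse += (predict(embed(th)) - math.sin(3 * th)) ** 2
mse /= 100
print(mse)  # small despite the 10-dimensional ambient space
```

The paper's estimator replaces this fixed-bandwidth averaging with a multiscale construction that adapts the local scale to the data, achieving minimax rates dictated by the intrinsic dimension.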

Doubly Robust Off-Policy Learning on Low-Dimensional Manifolds by Deep Neural Networks

no code implementations • 3 Nov 2020 • Minshuo Chen, Hao Liu, Wenjing Liao, Tuo Zhao

Our theory shows that deep neural networks are adaptive to the low-dimensional geometric structures of the covariates, and partially explains the success of deep learning for causal inference.

Causal Inference

Distribution Approximation and Statistical Estimation Guarantees of Generative Adversarial Networks

no code implementations • 10 Feb 2020 • Minshuo Chen, Wenjing Liao, Hongyuan Zha, Tuo Zhao

Generative Adversarial Networks (GANs) have achieved great success in unsupervised learning.

Learning functions varying along a central subspace

no code implementations • 22 Jan 2020 • Hao Liu, Wenjing Liao

The estimation error of this variance quantity is also given in this paper.

Open-Ended Question Answering, regression

Efficient Approximation of Deep ReLU Networks for Functions on Low Dimensional Manifolds

no code implementations • NeurIPS 2019 • Minshuo Chen, Haoming Jiang, Wenjing Liao, Tuo Zhao

The network size scales exponentially in the approximation error, with an exponent depending on the intrinsic dimension of the data and the smoothness of the function.

Nonparametric Regression on Low-Dimensional Manifolds using Deep ReLU Networks: Function Approximation and Statistical Recovery

no code implementations • NeurIPS 2019 • Minshuo Chen, Haoming Jiang, Wenjing Liao, Tuo Zhao

It therefore demonstrates the adaptivity of deep ReLU networks to low-dimensional geometric structures of data, and partially explains the power of deep ReLU networks in tackling high-dimensional data with low-dimensional geometric structures.

regression

IDENT: Identifying Differential Equations with Numerical Time evolution

no code implementations • 6 Apr 2019 • Sung Ha Kang, Wenjing Liao, Yingjie Liu

The new algorithm, called Identifying Differential Equations with Numerical Time evolution (IDENT), is explored for data with non-periodic boundary conditions, noisy data and PDEs with varying coefficients.

Numerical Analysis

Adaptive Geometric Multiscale Approximations for Intrinsically Low-dimensional Data

no code implementations • 3 Nov 2016 • Wenjing Liao, Mauro Maggioni

We consider the problem of efficiently approximating and encoding high-dimensional data sampled from a probability distribution $\rho$ in $\mathbb{R}^D$ that is nearly supported on a $d$-dimensional set $\mathcal{M}$, for example a $d$-dimensional Riemannian manifold.

Dictionary Learning
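A minimal sketch of the piecewise-linear approximation idea under simplifying assumptions (a unit circle in $\mathbb{R}^3$, cells chosen by angle, a single scale), not the paper's adaptive multiscale construction: each cell of data is encoded by its mean plus its top principal direction, giving a local affine chart whose reconstruction error shrinks as the cells refine.

```python
import math, random

random.seed(0)
D = 3  # ambient dimension; the data lie on a circle (intrinsic d = 1)
pts = []
for _ in range(600):
    th = random.uniform(0, 2 * math.pi)
    pts.append([math.cos(th), math.sin(th), 0.0])

def top_direction(cell, mean):
    # top eigenvector of the 3x3 cell covariance via power iteration
    C = [[sum((p[i] - mean[i]) * (p[j] - mean[j]) for p in cell) / len(cell)
          for j in range(D)] for i in range(D)]
    v = [1.0, 1.0, 1.0]
    for _ in range(100):
        w = [sum(C[i][j] * v[j] for j in range(D)) for i in range(D)]
        nrm = math.sqrt(sum(x * x for x in w))
        v = [x / nrm for x in w]
    return v

K = 16  # number of cells; finer partitions give smaller error
err = 0.0
for k in range(K):
    lo, hi = 2 * math.pi * k / K, 2 * math.pi * (k + 1) / K
    cell = [p for p in pts
            if lo <= math.atan2(p[1], p[0]) % (2 * math.pi) < hi]
    if not cell:
        continue
    mean = [sum(p[i] for p in cell) / len(cell) for i in range(D)]
    v = top_direction(cell, mean)
    for p in cell:
        # project each point onto the affine line mean + span(v)
        c = sum((p[i] - mean[i]) * v[i] for i in range(D))
        proj = [mean[i] + c * v[i] for i in range(D)]
        err += sum((p[i] - proj[i]) ** 2 for i in range(D))
err /= len(pts)
print(err)  # mean squared reconstruction error of the local charts
```

The paper organizes such local affine approximations into a multiscale tree and selects the scale adaptively, with guarantees on the trade-off between encoding cost and approximation error.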
