Search Results for author: Ling Guo

Found 10 papers, 3 papers with code

Energy based diffusion generator for efficient sampling of Boltzmann distributions

no code implementations • 4 Jan 2024 • Yan Wang, Ling Guo, Hao Wu, Tao Zhou

We introduce a novel sampler called the energy based diffusion generator for generating samples from arbitrary target distributions.

IB-UQ: Information bottleneck based uncertainty quantification for neural function regression and neural operator learning

no code implementations • 7 Feb 2023 • Ling Guo, Hao Wu, Wenwen Zhou, Yan Wang, Tao Zhou

We propose a novel framework for uncertainty quantification via information bottleneck (IB-UQ) for scientific machine learning tasks, including deep neural network (DNN) regression and neural operator learning (DeepONet).

Data Augmentation • Operator learning • +2

Monte Carlo PINNs: deep learning approach for forward and inverse problems involving high dimensional fractional partial differential equations

no code implementations • 16 Mar 2022 • Ling Guo, Hao Wu, Xiaochen Yu, Tao Zhou

We introduce a sampling-based machine learning approach, Monte Carlo physics-informed neural networks (MC-PINNs), for solving forward and inverse fractional partial differential equations (FPDEs).

Uncertainty Quantification in Scientific Machine Learning: Methods, Metrics, and Comparisons

1 code implementation • 19 Jan 2022 • Apostolos F Psaros, Xuhui Meng, Zongren Zou, Ling Guo, George Em Karniadakis

Neural networks (NNs) are currently changing the computational paradigm on how to combine data with mathematical laws in physics and engineering in a profound way, tackling challenging inverse and ill-posed problems not solvable with traditional methods.

BIG-bench Machine Learning • Uncertainty Quantification

Normalizing field flows: Solving forward and inverse stochastic differential equations using physics-informed flow models

no code implementations • 30 Aug 2021 • Ling Guo, Hao Wu, Tao Zhou

We introduce in this work the normalizing field flows (NFF) for learning random fields from scattered measurements.

Gaussian Processes

Frivolous Units: Wider Networks Are Not Really That Wide

1 code implementation • 10 Dec 2019 • Stephen Casper, Xavier Boix, Vanessa D'Amario, Ling Guo, Martin Schrimpf, Kasper Vinken, Gabriel Kreiman

We identify two distinct types of "frivolous" units that proliferate when the network's width is increased: prunable units, which can be dropped out of the network without significant change to the output, and redundant units, whose activities can be expressed as a linear combination of others.
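The redundancy criterion above can be illustrated numerically: a unit is redundant to the extent that its activations are linearly predictable from the other units' activations. The sketch below is not the paper's implementation; it just scores each unit by the R² of a least-squares fit from the remaining units, on synthetic activations where one unit is constructed as a linear combination of three others.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical activations of 6 hidden units over 200 inputs.
# Unit 5 is built as a linear combination of units 0-2, so the
# set {0, 1, 2, 5} is linearly dependent.
A = rng.normal(size=(200, 5))
redundant = 0.5 * A[:, 0] - 1.2 * A[:, 1] + 0.3 * A[:, 2]
A = np.column_stack([A, redundant])

def redundancy_score(acts, j):
    """R^2 of predicting unit j's activations from all other units."""
    X = np.delete(acts, j, axis=1)
    y = acts[:, j]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return 1.0 - resid.var() / y.var()

scores = [redundancy_score(A, j) for j in range(A.shape[1])]
# Units 0, 1, 2, and 5 are mutually dependent and score near 1;
# the independent units 3 and 4 score near 0.
print(np.round(scores, 3))
```

Note that linear redundancy is mutual: every unit participating in the dependence scores high, not only the one that was constructed last.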

Learning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks

no code implementations • 3 May 2019 • Dongkun Zhang, Ling Guo, George Em Karniadakis

One of the open problems in scientific computing is the long-time integration of nonlinear stochastic partial differential equations (SPDEs).

Quantifying total uncertainty in physics-informed neural networks for solving forward and inverse stochastic problems

no code implementations • 21 Sep 2018 • Dongkun Zhang, Lu Lu, Ling Guo, George Em Karniadakis

Here, we propose a new method with the objective of endowing the DNN with uncertainty quantification for both sources of uncertainty, i.e., the parametric uncertainty and the approximation uncertainty.

Active Learning • Uncertainty Quantification
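The two-source split mentioned in the abstract can be sketched in a generic ensemble setting (this is an illustration under assumed names, not the paper's specific method): given several probabilistic surrogates, the disagreement between their mean predictions reflects parametric uncertainty, while their average predictive variance reflects the approximation (data) uncertainty, and the two add to a total.

```python
import numpy as np

rng = np.random.default_rng(1)

n_members, n_points = 10, 50
# Per-member mean predictions at each query point (synthetic).
means = rng.normal(loc=0.0, scale=0.2, size=(n_members, n_points))
# Per-member predictive variances (synthetic, constant here).
variances = np.full((n_members, n_points), 0.05)

parametric_unc = means.var(axis=0)           # spread across members
approximation_unc = variances.mean(axis=0)   # average intrinsic noise estimate
total_unc = parametric_unc + approximation_unc

print(np.round(total_unc[:3], 4))
```

This mean-variance decomposition is the standard law-of-total-variance split used by deep ensembles; how the two terms are estimated for physics-informed networks is what the paper itself addresses.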
