Search Results for author: Michael D. Shields

Found 19 papers, 6 papers with code

Physics-constrained polynomial chaos expansion for scientific machine learning and uncertainty quantification

no code implementations • 23 Feb 2024 • Himanshu Sharma, Lukáš Novák, Michael D. Shields

We present a novel physics-constrained polynomial chaos expansion as a surrogate modeling method capable of performing both scientific machine learning (SciML) and uncertainty quantification (UQ) tasks.

Uncertainty Quantification
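
A polynomial chaos expansion represents a model output as a series in polynomials orthogonal with respect to the input distribution. As a rough illustration of the plain (unconstrained) technique — the paper's physics constraints are not reproduced here, and all names below are hypothetical — a 1-D Hermite PCE can be fit by least squares:

```python
import numpy as np

# Minimal sketch: 1-D polynomial chaos expansion (PCE) surrogate fit by
# least squares. Probabilists' Hermite polynomials are orthogonal w.r.t.
# a standard normal input. The paper's physics constraints are NOT
# included; this only shows the basic surrogate construction.
rng = np.random.default_rng(0)

def hermite_basis(x, degree):
    # He_0, He_1, ... He_degree via the recurrence He_{n+1} = x He_n - n He_{n-1}
    H = [np.ones_like(x), x]
    for n in range(1, degree):
        H.append(x * H[n] - n * H[n - 1])
    return np.column_stack(H[: degree + 1])

# Toy model with a standard normal input (stand-in for an expensive solver)
f = lambda x: np.sin(x) + 0.1 * x**2
x_train = rng.standard_normal(200)
y_train = f(x_train)

Psi = hermite_basis(x_train, degree=5)
coeffs, *_ = np.linalg.lstsq(Psi, y_train, rcond=None)

# Surrogate prediction at unseen points
x_test = rng.standard_normal(500)
y_pred = hermite_basis(x_test, 5) @ coeffs
rmse = np.sqrt(np.mean((y_pred - f(x_test)) ** 2))

# By orthogonality, the zeroth coefficient estimates the output mean,
# which is what makes PCE convenient for UQ
mean_estimate = coeffs[0]
```

Higher-order output statistics (variance, Sobol indices) follow analytically from the remaining coefficients, which is the usual appeal of PCE for UQ.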

Polynomial Chaos Expansions on Principal Geodesic Grassmannian Submanifolds for Surrogate Modeling and Uncertainty Quantification

no code implementations • 30 Jan 2024 • Dimitris G. Giovanis, Dimitrios Loukrezis, Ioannis G. Kevrekidis, Michael D. Shields

To this end, we employ Principal Geodesic Analysis on the Grassmann manifold of the response to identify a set of disjoint principal geodesic submanifolds, of possibly different dimension, that captures the variation in the data.

Uncertainty Quantification

Physics-Informed Polynomial Chaos Expansions

no code implementations • 4 Sep 2023 • Lukáš Novák, Himanshu Sharma, Michael D. Shields

This paper presents a novel methodology for the construction of physics-informed polynomial chaos expansions (PCE) that combines the conventional experimental design with additional constraints from the physics of the model.

Experimental Design Uncertainty Quantification

On Active Learning for Gaussian Process-based Global Sensitivity Analysis

no code implementations • 27 Aug 2023 • Mohit Chauhan, Mariel Ojeda-Tuz, Ryan Catarelli, Kurtis Gurley, Dimitrios Tsapetis, Michael D. Shields

We propose a novel strategy for active learning that focuses on resolving the main effects of the Gaussian process (associated with the numerator of the Sobol index) and compare this with existing strategies based on convergence in the total variance (the denominator of the Sobol index).

Active Learning Experimental Design
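
The numerator and denominator of a first-order Sobol index can both be estimated by plain Monte Carlo with a "pick-freeze" design. This sketch is not the authors' active-learning strategy — it only makes the two quantities in the abstract concrete, on a toy model with a known analytic answer:

```python
import numpy as np

# Pick-freeze Monte Carlo estimate of a first-order Sobol index
# S_i = Var(E[Y | X_i]) / Var(Y). The paper's active learning targets
# the numerator; here both terms are computed directly.
rng = np.random.default_rng(1)

def model(X):
    # Toy additive model: Y = X1 + 0.5*X2, with X1, X2 ~ U(0, 1)
    return X[:, 0] + 0.5 * X[:, 1]

N = 100_000
A = rng.random((N, 2))
B = rng.random((N, 2))
yA, yB = model(A), model(B)

# "Freeze" input 0 from A, resample the remaining input from B
BA0 = B.copy()
BA0[:, 0] = A[:, 0]
yBA0 = model(BA0)

var_total = np.var(yA)               # denominator of the Sobol index
num = np.mean(yA * (yBA0 - yB))      # numerator, Var(E[Y | X_1])
S1 = num / var_total
# Analytic value: Var(X1) / (Var(X1) + 0.25 * Var(X2)) = 1 / 1.25 = 0.8
```

Because the two inputs enter additively, the first-order indices here sum to one; interaction effects would show up as a shortfall.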

Learning thermodynamically constrained equations of state with uncertainty

no code implementations • 29 Jun 2023 • Himanshu Sharma, Jim A. Gaffney, Dimitrios Tsapetis, Michael D. Shields

Since there are inherent uncertainties in the calibration data (parametric uncertainty) and the assumed functional EOS form (model uncertainty), it is essential to perform uncertainty quantification (UQ) to improve confidence in the EOS predictions.

GPR Uncertainty Quantification

Learning in latent spaces improves the predictive accuracy of deep neural operators

1 code implementation • 15 Apr 2023 • Katiana Kontolati, Somdatta Goswami, George Em Karniadakis, Michael D. Shields

Operator regression provides a powerful means of constructing discretization-invariant emulators for partial differential equations (PDEs) describing physical systems.

Computational Efficiency

Active Learning-based Domain Adaptive Localized Polynomial Chaos Expansion

no code implementations • 31 Jan 2023 • Lukáš Novák, Michael D. Shields, Václav Sadílek, Miroslav Vořechovský

The numerical results show the superiority of the DAL-PCE over (i) a single global polynomial chaos expansion and (ii) the recently proposed stochastic spectral embedding (SSE) method, an accurate surrogate model based on a similar domain decomposition process.

Active Learning

General multi-fidelity surrogate models: Framework and active learning strategies for efficient rare event simulation

no code implementations • 7 Dec 2022 • Promit Chakroborty, Somayajulu L. N. Dhulipala, Yifeng Che, Wen Jiang, Benjamin W. Spencer, Jason D. Hales, Michael D. Shields

The multi-fidelity surrogate is assembled by first applying a Gaussian process correction to each low-fidelity model and assigning a model probability based on the model's local predictive accuracy and cost.

Active Learning Model Selection
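
The Gaussian process correction step described in the abstract can be sketched in a few lines: fit a GP to the discrepancy between a cheap low-fidelity model and a handful of high-fidelity evaluations, then predict with the sum. The functions and numbers below are hypothetical, and the model-probability assignment is omitted:

```python
import numpy as np

# Sketch of a GP-corrected low-fidelity model:
# f_hi(x) ~ f_lo(x) + delta_GP(x), where delta_GP is a GP posterior mean
# fit to the discrepancy at a few high-fidelity points. The paper also
# assigns each corrected model a probability from local accuracy and
# cost; that step is not shown.
f_lo = lambda x: np.sin(x)             # cheap, biased model
f_hi = lambda x: np.sin(x) + 0.3 * x   # expensive "truth" (toy stand-in)

X = np.linspace(0.0, 3.0, 6)           # few high-fidelity evaluations
d = f_hi(X) - f_lo(X)                  # observed discrepancy

def rbf(a, b, ell=1.0):
    # Squared-exponential kernel
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

K = rbf(X, X) + 1e-8 * np.eye(len(X))  # jitter for numerical stability
alpha = np.linalg.solve(K, d)

def surrogate(x):
    # Multi-fidelity prediction: low-fidelity plus GP posterior mean
    return f_lo(x) + rbf(x, X) @ alpha

x_test = np.linspace(0.0, 3.0, 50)
err = np.max(np.abs(surrogate(x_test) - f_hi(x_test)))
```

The GP's predictive variance (not computed above) is what would feed the local accuracy term of the model probability.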

Bayesian Inference with Latent Hamiltonian Neural Networks

1 code implementation • 12 Aug 2022 • Somayajulu L. N. Dhulipala, Yifeng Che, Michael D. Shields

Compared to traditional NUTS, L-HNNs in NUTS with online error monitoring required 1--2 orders of magnitude fewer numerical gradients of the target density and improved the effective sample size (ESS) per gradient by an order of magnitude.

Bayesian Inference
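
The gradient counts quoted in the abstract refer to the leapfrog integrator inside HMC/NUTS, which evaluates the gradient of the target density at every step. This sketch shows that integrator with an exact gradient for a standard normal target; in the paper, a latent Hamiltonian neural network would stand in for that gradient:

```python
import numpy as np

# Leapfrog integrator at the core of HMC/NUTS, on a standard normal
# target with potential U(q) = q^2 / 2. An L-HNN would approximate
# grad_U, which is the expensive quantity the paper reduces calls to.
U = lambda q: 0.5 * q**2
grad_U = lambda q: q

def leapfrog(q, p, step, n_steps):
    # Half step in momentum, alternating full steps, closing half step
    p = p - 0.5 * step * grad_U(q)
    for _ in range(n_steps - 1):
        q = q + step * p
        p = p - step * grad_U(q)
    q = q + step * p
    p = p - 0.5 * step * grad_U(q)
    return q, p

q0, p0 = 1.0, 0.5
H0 = U(q0) + 0.5 * p0**2
q1, p1 = leapfrog(q0, p0, step=0.1, n_steps=20)
H1 = U(q1) + 0.5 * p1**2
energy_error = abs(H1 - H0)   # small error => high HMC acceptance rate
```

Leapfrog is symplectic, so the Hamiltonian error stays bounded rather than drifting, which is what keeps acceptance rates high over long trajectories.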

Deep transfer operator learning for partial differential equations under conditional shift

1 code implementation • 20 Apr 2022 • Somdatta Goswami, Katiana Kontolati, Michael D. Shields, George Em Karniadakis

Transfer learning (TL) enables the transfer of knowledge gained in learning to perform one task (source) to a related but different task (target), hence addressing the expense of data acquisition and labeling, potential computational power limitations, and dataset distribution mismatches.

Domain Adaptation Operator learning +4

On the influence of over-parameterization in manifold based surrogates and deep neural operators

1 code implementation • 9 Mar 2022 • Katiana Kontolati, Somdatta Goswami, Michael D. Shields, George Em Karniadakis

In contrast, an even highly over-parameterized DeepONet leads to better generalization for both smooth and non-smooth dynamics.

Operator learning

A survey of unsupervised learning methods for high-dimensional uncertainty quantification in black-box-type problems

no code implementations • 9 Feb 2022 • Katiana Kontolati, Dimitrios Loukrezis, Dimitris G. Giovanis, Lohit Vandanapu, Michael D. Shields

Constructing surrogate models for uncertainty quantification (UQ) on complex partial differential equations (PDEs) having inherently high-dimensional $\mathcal{O}(10^{\ge 2})$ stochastic inputs (e.g., forcing terms, boundary conditions, initial conditions) poses tremendous challenges.

blind source separation Dimensionality Reduction +1

Data-driven Uncertainty Quantification in Computational Human Head Models

no code implementations • 29 Oct 2021 • Kshitiz Upadhyay, Dimitris G. Giovanis, Ahmed Alshareef, Andrew K. Knutsen, Curtis L. Johnson, Aaron Carass, Philip V. Bayly, Michael D. Shields, K. T. Ramesh

This framework is demonstrated on a 2D subject-specific head model, where the goal is to quantify uncertainty in the simulated strain fields (i.e., output), given variability in the material properties of different brain substructures (i.e., input).

Density Estimation Dimensionality Reduction +1

Grassmannian diffusion maps based surrogate modeling via geometric harmonics

no code implementations • 28 Sep 2021 • Ketson R. M. dos Santos, Dimitrios G. Giovanis, Katiana Kontolati, Dimitrios Loukrezis, Michael D. Shields

Using this representation, geometric harmonics, an out-of-sample function extension technique, is employed to create a global map from the space of input parameters to a Grassmannian diffusion manifold.

Uncertainty Quantification

Manifold learning-based polynomial chaos expansions for high-dimensional surrogate models

2 code implementations • 21 Jul 2021 • Katiana Kontolati, Dimitrios Loukrezis, Ketson R. M. dos Santos, Dimitrios G. Giovanis, Michael D. Shields

For this purpose, we employ Grassmannian diffusion maps, a two-step nonlinear dimension reduction technique which allows us to reduce the dimensionality of the data and identify meaningful geometric descriptions in a parsimonious and inexpensive manner.

Dimensionality Reduction Uncertainty Quantification +1
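
Standard (Euclidean) diffusion maps, the building block behind the Grassmannian variant used here, reduce dimensionality by building a kernel, normalizing it into a Markov matrix, and keeping its leading non-trivial eigenvectors. A sketch on toy data — the Grassmann-manifold geometry of the paper is deliberately not reproduced:

```python
import numpy as np

# Plain diffusion maps on Euclidean data: Gaussian kernel -> row-stochastic
# Markov matrix -> leading non-trivial eigenvectors as low-dimensional
# diffusion coordinates.
rng = np.random.default_rng(2)

# Noisy 1-D curve (a circle) embedded in 3-D
t = np.sort(rng.random(150)) * 2 * np.pi
X = np.column_stack([np.cos(t), np.sin(t), 0.05 * rng.standard_normal(150)])

D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
eps = np.median(D2)                        # common bandwidth heuristic
K = np.exp(-D2 / eps)
P = K / K.sum(axis=1, keepdims=True)       # row-stochastic Markov matrix

vals, vecs = np.linalg.eig(P)
order = np.argsort(-vals.real)
# Eigenvalue 1 with a constant eigenvector is trivial; the next
# eigenpairs give the diffusion coordinates
coords = vecs.real[:, order[1:3]] * vals.real[order[1:3]]
```

In the paper's two-step pipeline, coordinates like these (computed on the Grassmann manifold) then serve as the reduced input space for the PCE surrogate.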

Probabilistic modeling of discrete structural response with application to composite plate penetration models

no code implementations • 23 Nov 2020 • Anindya Bhaduri, Christopher S. Meyer, John W. Gillespie Jr., Bazle Z. Haque, Michael D. Shields, Lori Graham-Brady

This enables the computationally feasible generation of the probabilistic velocity response (PVR) curve or the $V_0-V_{100}$ curve as a function of the impact velocity, and the ballistic limit velocity prediction as a function of the model parameters.
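
A probabilistic velocity response curve maps impact velocity to penetration probability, with $V_0$ and $V_{100}$ bracketing the transition and the ballistic limit near the 50% point. As a toy illustration only (not the authors' surrogate; the parameters below are invented), a logistic curve makes these quantities concrete:

```python
import numpy as np

# Toy probabilistic velocity response (PVR) curve: penetration
# probability as a logistic function of impact velocity v, with
# hypothetical ballistic limit V50 and steepness k.
V50, k = 300.0, 0.05                 # m/s and 1/(m/s), invented values
p_pen = lambda v: 1.0 / (1.0 + np.exp(-k * (v - V50)))

v = np.linspace(200.0, 400.0, 201)
curve = p_pen(v)

# Read off V0 / V100 as the velocities where penetration probability
# first exceeds 1% / 99%
v0 = v[np.argmax(curve > 0.01)]
v100 = v[np.argmax(curve > 0.99)]
```

Propagating uncertainty in parameters like `V50` and `k` through this map is what turns a single curve into the probabilistic $V_0$–$V_{100}$ band described in the abstract.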
