Search Results for author: Omar Ghattas

Found 16 papers, 12 papers with code

Efficient geometric Markov chain Monte Carlo for nonlinear Bayesian inversion enabled by derivative-informed neural operators

no code implementations • 13 Mar 2024 • Lianghao Cao, Thomas O'Leary-Roseberry, Omar Ghattas

Furthermore, the training cost of DINO surrogates breaks even after collecting merely 10–25 effective posterior samples compared to geometric MCMC.

Operator learning
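
To make the surrogate's role concrete, here is a minimal sketch (not the authors' implementation) of a MALA-style sampler whose proposal drifts along a cheap surrogate gradient while the accept/reject step evaluates the true log-posterior, so the chain still targets the exact posterior. The names `log_post` and `grad_log_post_surr` are hypothetical placeholders.

```python
import numpy as np

def surrogate_mala(x0, log_post, grad_log_post_surr, step, n_steps, rng):
    """MALA whose proposal drifts along a cheap surrogate gradient; the
    accept/reject step uses the true log-posterior, so the chain still
    targets the exact posterior."""
    def log_q(y, x):  # proposal density N(x + step*g(x), 2*step*I), up to a constant
        mu = x + step * grad_log_post_surr(x)
        return -np.sum((y - mu) ** 2) / (4.0 * step)

    x, lp, samples = x0, log_post(x0), []
    for _ in range(n_steps):
        y = x + step * grad_log_post_surr(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
        lp_y = log_post(y)
        if np.log(rng.uniform()) < lp_y - lp + log_q(x, y) - log_q(y, x):
            x, lp = y, lp_y
        samples.append(x)
    return np.array(samples)
```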

Efficient PDE-Constrained optimization under high-dimensional uncertainty using derivative-informed neural operators

1 code implementation • 31 May 2023 • Dingcheng Luo, Thomas O'Leary-Roseberry, Peng Chen, Omar Ghattas

We propose a novel machine learning framework for solving optimization problems governed by large-scale partial differential equations (PDEs) with high-dimensional random parameters.
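
As a rough sketch of the outer loop such a framework enables, the code below minimizes a sample-average approximation of a mean-variance risk measure, with every PDE solve replaced by a surrogate. The stand-in `q_surr` is hypothetical; in the paper's setting it would be a trained derivative-informed neural operator.

```python
import torch

def ouu_objective(z, m_samples, q_surr, beta=1.0):
    """Sample-average approximation of the risk measure E[Q] + beta * Var[Q],
    with every PDE solve replaced by the surrogate q_surr."""
    q = torch.stack([q_surr(z, m) for m in m_samples])
    return q.mean() + beta * q.var()

# hypothetical usage: z is the design/control, m_samples draws of the random parameter
q_surr = lambda z, m: ((z - m[:10]) ** 2).sum()   # stand-in for a trained neural operator
z = torch.zeros(10, requires_grad=True)
m_samples = [torch.randn(50) for _ in range(64)]
opt = torch.optim.Adam([z], lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    ouu_objective(z, m_samples, q_surr).backward()
    opt.step()
```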

Residual-based error correction for neural operator accelerated infinite-dimensional Bayesian inverse problems

no code implementations • 6 Oct 2022 • Lianghao Cao, Thomas O'Leary-Roseberry, Prashant K. Jha, J. Tinsley Oden, Omar Ghattas

We show that a trained neural operator with error correction can achieve a quadratic reduction of its approximation error, all while retaining substantial computational speedups of posterior sampling when models are governed by highly nonlinear PDEs.
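
The quadratic error reduction has a familiar scalar analogue: one linearized (Newton-type) residual-correction step roughly squares the error. A toy illustration, far from the paper's infinite-dimensional setting:

```python
import numpy as np

# Toy semilinear residual R(u) = u + u^3 - 2 with exact solution u* = 1.
R  = lambda u: u + u**3 - 2.0
dR = lambda u: 1.0 + 3.0 * u**2

for err0 in [1e-1, 1e-2, 1e-3]:       # pretend these are surrogate prediction errors
    u0 = 1.0 + err0                   # "neural operator" output
    u1 = u0 - R(u0) / dR(u0)          # one linearized residual-correction step
    print(f"error {err0:.0e} -> {abs(u1 - 1.0):.1e}")   # drops roughly quadratically
```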

Bayesian model calibration for block copolymer self-assembly: Likelihood-free inference and expected information gain computation via measure transport

no code implementations • 22 Jun 2022 • Ricardo Baptista, Lianghao Cao, Joshua Chen, Omar Ghattas, Fengyi Li, Youssef M. Marzouk, J. Tinsley Oden

We tackle this challenging Bayesian inference problem using a likelihood-free approach based on measure transport together with the construction of summary statistics for the image data.
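Placeholder-edit-guard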

Bayesian Inference • Informativeness
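
The simplest instance of a transport map for likelihood-free conditioning is the affine (ensemble-Kalman-style) update sketched below, which is exact only in the jointly Gaussian case. The paper constructs far more expressive maps, so treat this as a crude baseline with hypothetical names.

```python
import numpy as np

def affine_posterior_update(theta, y, y_obs):
    """Fit an affine transport map from joint prior-predictive samples
    (theta_i, y_i) and condition on y_obs. Exact only when (theta, y) is
    jointly Gaussian -- a crude likelihood-free baseline."""
    T, Y = theta - theta.mean(0), y - y.mean(0)
    K = (T.T @ Y / (len(y) - 1)) @ np.linalg.inv(Y.T @ Y / (len(y) - 1))
    return theta + (y_obs - y) @ K.T   # push each joint sample to the conditional
```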

Derivative-Informed Neural Operator: An Efficient Framework for High-Dimensional Parametric Derivative Learning

1 code implementation • 21 Jun 2022 • Thomas O'Leary-Roseberry, Peng Chen, Umberto Villa, Omar Ghattas

We propose derivative-informed neural operators (DINOs), a general family of neural networks to approximate operators as infinite-dimensional mappings from input function spaces to output function spaces or quantities of interest.

Dimensionality Reduction • Experimental Design
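
In plain finite dimensions, the derivative-informed idea amounts to adding a Jacobian-matching term to the training loss. A minimal sketch (the paper works with reduced-basis derivatives in function space; `torch.func` requires PyTorch ≥ 2.0):

```python
import torch
from torch.func import jacrev, vmap

def dino_loss(model, m, u_true, J_true):
    """Output-matching term plus a Jacobian-matching ('derivative-informed')
    term; m is a batch of parameter samples, J_true the reference Jacobians."""
    u_pred = model(m)                       # (n, k) batched outputs
    J_pred = vmap(jacrev(model))(m)         # (n, k, d) batched Jacobians d(model)/dm
    return ((u_pred - u_true) ** 2).mean() + ((J_pred - J_true) ** 2).mean()

# hypothetical usage with a small MLP surrogate
model = torch.nn.Sequential(torch.nn.Linear(4, 16), torch.nn.Tanh(),
                            torch.nn.Linear(16, 3))
m = torch.randn(8, 4)
loss = dino_loss(model, m, torch.randn(8, 3), torch.randn(8, 3, 4))
loss.backward()
```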

A stochastic Stein Variational Newton method

1 code implementation • 19 Apr 2022 • Alex Leviyev, Joshua Chen, Yifei Wang, Omar Ghattas, Aaron Zimmerman

Meanwhile, Stein variational Newton (SVN), a Newton-like extension of SVGD, dramatically accelerates the convergence of SVGD by incorporating Hessian information into the dynamics, but also produces biased samples.

Bayesian Inference
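
For reference, one SVGD update with an RBF kernel looks as follows; SVN replaces the scalar step size with Hessian-based preconditioning. A minimal sketch, not the paper's stochastic variant:

```python
import numpy as np

def svgd_step(x, grad_log_p, h, eps):
    """One SVGD update with an RBF kernel: a driving term pulls particles
    toward high posterior density, a repulsion term keeps them spread out."""
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]            # diff[i, j] = x_i - x_j
    K = np.exp(-(diff ** 2).sum(-1) / (2 * h**2))   # RBF kernel matrix
    drive = K @ grad_log_p(x)                       # sum_j k(x_j, x_i) grad log p(x_j)
    repulse = (diff * K[..., None]).sum(1) / h**2   # sum_j grad_{x_j} k(x_j, x_i)
    return x + eps * (drive + repulse) / n
```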

Learning High-Dimensional Parametric Maps via Reduced Basis Adaptive Residual Networks

2 code implementations • 14 Dec 2021 • Thomas O'Leary-Roseberry, Xiaosong Du, Anirban Chaudhuri, Joaquim R. R. A. Martins, Karen Willcox, Omar Ghattas

We propose a scalable framework for the learning of high-dimensional parametric maps via adaptively constructed residual network (ResNet) maps between reduced bases of the inputs and outputs.

Experimental Design
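
A minimal sketch of the architecture pattern, assuming precomputed reduced bases `V_in` and `V_out` (e.g., from POD or active subspaces); the paper additionally grows the network adaptively:

```python
import torch
import torch.nn as nn

class ReducedBasisResNet(nn.Module):
    """Learn the map between reduced coordinates: encode with an input basis,
    apply residual layers, decode with an output basis."""
    def __init__(self, V_in, V_out, width, depth):
        super().__init__()
        self.register_buffer("V_in", V_in)      # (d_in, r_in)
        self.register_buffer("V_out", V_out)    # (d_out, r_out)
        self.lift = nn.Linear(V_in.shape[1], width)
        self.blocks = nn.ModuleList(nn.Linear(width, width) for _ in range(depth))
        self.proj = nn.Linear(width, V_out.shape[1])

    def forward(self, x):
        z = self.lift(x @ self.V_in)            # restrict input to the reduced basis
        for f in self.blocks:
            z = z + torch.tanh(f(z))            # residual (ResNet) update
        return self.proj(z) @ self.V_out.T      # expand back to the output space
```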

An efficient method for goal-oriented linear Bayesian optimal experimental design: Application to optimal sensor placement

1 code implementation • 12 Feb 2021 • Keyi Wu, Peng Chen, Omar Ghattas

Optimal experimental design (OED) plays an important role in the problem of identifying uncertainty with limited experimental data.

Optimization and Control • Numerical Analysis
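
For the linear Gaussian case, the expected information gain has a closed form and a greedy placement loop is easy to sketch (hypothetical names; the paper's contribution is making such evaluations scale to high dimensions):

```python
import numpy as np

def greedy_sensor_placement(F, Sigma_pr, noise_var, n_sensors):
    """Greedy EIG maximization for a linear Gaussian model y_S = F_S m + noise:
    EIG(S) = 0.5 * logdet(I + F_S Sigma_pr F_S^T / noise_var)."""
    def eig(S):
        G = F[S] @ Sigma_pr @ F[S].T / noise_var
        return 0.5 * np.linalg.slogdet(np.eye(len(S)) + G)[1]

    chosen, remaining = [], list(range(F.shape[0]))
    for _ in range(n_sensors):
        best = max(remaining, key=lambda k: eig(chosen + [k]))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```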

Derivative-Informed Projected Neural Networks for High-Dimensional Parametric Maps Governed by PDEs

1 code implementation • 30 Nov 2020 • Thomas O'Leary-Roseberry, Umberto Villa, Peng Chen, Omar Ghattas

We use the projection basis vectors in the active subspace as well as the principal output subspace to construct the weights for the first and last layers of the neural network, respectively.

Experimental Design • Uncertainty Quantification
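
A sketch of that initialization pattern, assuming precomputed bases `V_as` (active subspace, shape `(d_in, r_in)`) and `V_out` (principal output subspace, shape `(d_out, r_out)`):

```python
import torch
import torch.nn as nn

def build_projected_net(V_as, V_out, hidden):
    """First layer initialized from the active subspace basis, last layer
    from the principal output subspace, with a small trainable core between."""
    first = nn.Linear(V_as.shape[0], V_as.shape[1])
    last = nn.Linear(V_out.shape[1], V_out.shape[0])
    with torch.no_grad():
        first.weight.copy_(V_as.T)   # rows = input projection directions
        last.weight.copy_(V_out)     # columns span the dominant output subspace
    return nn.Sequential(first, nn.Tanh(),
                         nn.Linear(V_as.shape[1], hidden), nn.Tanh(),
                         nn.Linear(hidden, V_out.shape[1]), last)
```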

Tensor train construction from tensor actions, with application to compression of large high order derivative tensors

1 code implementation • 14 Feb 2020 • Nick Alger, Peng Chen, Omar Ghattas

We present a method for converting tensors into tensor train format based on actions of the tensor as a vector-valued multilinear function.

Numerical Analysis
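
For contrast, the classical TT-SVD below requires the full tensor in memory, which is exactly the bottleneck an action-based construction avoids:

```python
import numpy as np

def tt_svd(A, max_rank):
    """Classical TT-SVD: sequential reshape + truncated SVD sweeps over the
    full tensor A -- the very cost the action-based method sidesteps."""
    dims, cores, r = A.shape, [], 1
    M = A.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        rk = min(max_rank, len(s))
        cores.append(U[:, :rk].reshape(r, dims[k], rk))
        M = (s[:rk, None] * Vt[:rk]).reshape(rk * dims[k + 1], -1)
        r = rk
    cores.append(M.reshape(r, dims[-1], 1))
    return cores
```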

Projected Stein Variational Gradient Descent

1 code implementation NeurIPS 2020 Peng Chen, Omar Ghattas

The curse of dimensionality is a longstanding challenge in Bayesian inference in high dimensions.

Bayesian Inference
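
A sketch of one projected-SVGD-style update, where particles move only within the subspace spanned by the columns of `V`; freezing the complement `x_perp` is a simplification of how the method handles the uninformed directions.

```python
import numpy as np

def psvgd_step(z, x_perp, V, grad_log_p, h, eps):
    """One projected SVGD update in reduced coordinates z (shape (n, r));
    the complement x_perp is held fixed in this simplified sketch."""
    x = z @ V.T + x_perp                            # lift to the full space
    g = grad_log_p(x) @ V                           # project the gradient, (n, r)
    diff = z[:, None, :] - z[None, :, :]
    K = np.exp(-(diff ** 2).sum(-1) / (2 * h**2))
    phi = (K @ g + (diff * K[..., None]).sum(1) / h**2) / len(z)
    return z + eps * phi
```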

Ill-Posedness and Optimization Geometry for Nonlinear Neural Network Training

no code implementations • 7 Feb 2020 • Thomas O'Leary-Roseberry, Omar Ghattas

We show that the nonlinear activation functions used in the network construction play a critical role in classifying stationary points of the loss landscape.

Low Rank Saddle Free Newton: A Scalable Method for Stochastic Nonconvex Optimization

2 code implementations • 7 Feb 2020 • Thomas O'Leary-Roseberry, Nick Alger, Omar Ghattas

In this work we motivate the extension of Newton methods to the SA regime, and argue for the use of the scalable low rank saddle free Newton (LRSFN) method, which avoids forming the Hessian in favor of making a low rank approximation.

Second-order methods
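
A matrix-free sketch of the step construction, assuming only a Hessian-vector product `hess_matvec` is available (a reading of the idea, not the authors' code):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

def lrsfn_step(grad, hess_matvec, dim, rank, damping):
    """Low-rank saddle-free Newton: Lanczos finds the dominant Hessian
    eigenpairs matrix-free; |eigenvalue| curvature flips saddle directions,
    and the complement falls back to a damped gradient step."""
    H = LinearOperator((dim, dim), matvec=hess_matvec)
    lam, V = eigsh(H, k=rank, which="LM")          # largest-magnitude eigenpairs
    g_low = V.T @ grad
    step = V @ (g_low / (np.abs(lam) + damping))   # saddle-free curvature scaling
    step += (grad - V @ g_low) / damping           # damped step off the subspace
    return -step
```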

Disentangled behavioural representations

1 code implementation • NeurIPS 2019 • Amir Dezfouli, Hassan Ashtiani, Omar Ghattas, Richard Nock, Peter Dayan, Cheng Soon Ong

Individual characteristics in human decision-making are often quantified by fitting a parametric cognitive model to subjects' behavior and then studying differences between them in the associated parameter space.

Decision Making

Inexact Newton Methods for Stochastic Non-Convex Optimization with Applications to Neural Network Training

1 code implementation • 16 May 2019 • Thomas O'Leary-Roseberry, Nick Alger, Omar Ghattas

We survey sub-sampled inexact Newton methods and consider their application in non-convex settings.

Optimization and Control • Numerical Analysis
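
A minimal sketch of one sub-sampled inexact Newton-CG step: the Hessian-vector product is evaluated on a mini-batch and the CG solve is truncated early (a practical nonconvex variant would also guard against negative curvature).

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def subsampled_newton_cg_step(w, full_grad, batch_hess_vec, batch, lr=1.0):
    """One inexact Newton step with a sub-sampled Hessian-vector product
    and an early-terminated CG solve (the 'inexact' part)."""
    g = full_grad(w)
    H = LinearOperator((w.size, w.size), matvec=lambda v: batch_hess_vec(w, v, batch))
    p, _ = cg(H, g, maxiter=25)     # truncated CG solve
    return w - lr * p
```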
