Search Results for author: Nicholas Zabaras

Found 21 papers, 15 papers with code

Deep Learning for Simultaneous Inference of Hydraulic and Transport Properties

no code implementations · 24 Oct 2021 · Zitong Zhou, Nicholas Zabaras, Daniel M. Tartakovsky

We use a convolutional adversarial autoencoder (CAAE) for the parameterization of the heterogeneous non-Gaussian conductivity field with a low-dimensional latent representation.

Computational Efficiency
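To illustrate the parameterization idea, a low-dimensional latent code can stand in for a high-dimensional conductivity field. The sketch below uses toy linear maps in place of the paper's convolutional encoder and decoder; all sizes and names are assumptions, and the adversarial critic is only described in a comment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes; the paper's CAAE is convolutional and far larger (assumption).
latent_dim, field_shape = 8, (16, 16)
n_pixels = field_shape[0] * field_shape[1]

# "Decoder": maps a low-dimensional latent code to a conductivity field.
W_dec = rng.standard_normal((n_pixels, latent_dim)) / np.sqrt(latent_dim)

def decode(z):
    # tanh gives the field a non-Gaussian marginal, mimicking the target class
    return np.tanh(W_dec @ z).reshape(field_shape)

# "Encoder": maps a field back down to the latent space.
W_enc = rng.standard_normal((latent_dim, n_pixels)) / np.sqrt(n_pixels)

def encode(k_field):
    return W_enc @ k_field.ravel()

# Adversarial training (not shown) would push encoded latents toward N(0, I),
# so new fields are sampled by drawing z from the prior and decoding.
z = rng.standard_normal(latent_dim)
k = decode(z)
z_rec = encode(k)
print(k.shape, z_rec.shape)  # (16, 16) (8,)
```

The point of the low-dimensional latent space is that downstream inference runs over the 8 latent variables rather than the 256 pixel values.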

A Bayesian Multiscale Deep Learning Framework for Flows in Random Media

1 code implementation · 8 Mar 2021 · Govinda Anantha Padmanabha, Nicholas Zabaras

In addition, it is challenging to develop accurate surrogate and uncertainty quantification models for high-dimensional problems governed by stochastic multiscale PDEs using limited training data.

Uncertainty Quantification

Bayesian multiscale deep generative model for the solution of high-dimensional inverse problems

2 code implementations · 4 Feb 2021 · Yingzhi Xia, Nicholas Zabaras

In this way, the global features are identified at the coarse scale through inference of low-dimensional variables and inexpensive forward computation, and the local features are refined and corrected at the fine scale.

Bayesian Inference

Transformers for Modeling Physical Systems

2 code implementations · 4 Oct 2020 · Nicholas Geneva, Nicholas Zabaras

Transformers are widely used in natural language processing due to their ability to model longer-term dependencies in text.
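The longer-term dependency modeling rests on self-attention, in which every time step of a sequence is compared against every other step. A minimal single-head version with random, untrained weights and toy sizes (a sketch of the mechanism, not the paper's architecture):

```python
import numpy as np

def self_attention(X):
    """Single-head scaled dot-product attention: the mechanism that lets a
    transformer weigh distant time steps when predicting the next state."""
    d = X.shape[-1]
    rng = np.random.default_rng(1)
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)  # pairwise similarity between time steps
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the sequence
    return weights @ V  # each step is a weighted mix of all other steps

# A toy trajectory of 10 time steps, each a 4-dimensional physical state.
X = np.random.default_rng(0).standard_normal((10, 4))
out = self_attention(X)
print(out.shape)  # (10, 4)
```

Because the attention weights span the whole sequence, the output at any step can depend on states arbitrarily far in the past, unlike a fixed-window recurrence.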

Physics-Constrained Predictive Molecular Latent Space Discovery with Graph Scattering Variational Autoencoder

1 code implementation · 29 Sep 2020 · Navid Shervani-Tabar, Nicholas Zabaras

In this work, we assess the predictive capabilities of a molecular generative model developed based on variational inference and graph theory in the small data regime.

Drug Discovery · molecular representation +1

Solving inverse problems using conditional invertible neural networks

1 code implementation · 31 Jul 2020 · Govinda Anantha Padmanabha, Nicholas Zabaras

In this work, we construct two- and three-dimensional inverse surrogate models, each consisting of an invertible and a conditional neural network trained in an end-to-end fashion with limited training data.
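A standard building block for invertible networks is the conditional affine coupling layer, which is invertible by construction. The paper's actual architecture differs; the toy layer below (assumed sizes, random weights) just demonstrates the exact-invertibility property that makes such models attractive for inverse problems:

```python
import numpy as np

def coupling_forward(x, cond, W):
    """Conditional affine coupling: half of x is rescaled and shifted using
    parameters predicted from the other half plus the conditioning input."""
    x1, x2 = x[:2], x[2:]
    h = np.tanh(W @ np.concatenate([x1, cond]))
    s, t = h[:2], h[2:]
    y2 = x2 * np.exp(s) + t  # invertible for any s, t
    return np.concatenate([x1, y2])

def coupling_inverse(y, cond, W):
    y1, y2 = y[:2], y[2:]
    h = np.tanh(W @ np.concatenate([y1, cond]))  # same s, t are recomputable
    s, t = h[:2], h[2:]
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2])

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 5))  # maps [x1, cond] (2+3 dims) to [s, t] (4 dims)
x, cond = rng.standard_normal(4), rng.standard_normal(3)
y = coupling_forward(x, cond, W)
x_rec = coupling_inverse(y, cond, W)
print(np.allclose(x, x_rec))  # True: the map is exactly invertible
```

Since the untouched half `x1` is passed through, the inverse can recompute the same scale and shift, so no information is lost in either direction.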

Multi-fidelity Generative Deep Learning Turbulent Flows

1 code implementation · 8 Jun 2020 · Nicholas Geneva, Nicholas Zabaras

The resulting surrogate is able to generate physically accurate turbulent realizations at a computational cost orders of magnitude lower than that of a high-fidelity simulation.

Embedded-physics machine learning for coarse-graining and collective variable discovery without data

no code implementations · 24 Feb 2020 · Markus Schöberl, Nicholas Zabaras, Phaedon-Stelios Koutsourelakis

Rather than separating model learning from the data-generation procedure - the latter relies on simulating atomistic motions governed by force fields - we query the atomistic force field at sample configurations proposed by the predictive coarse-grained model.

BIG-bench Machine Learning
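The query-at-proposals loop can be sketched in one dimension. The double-well force field, the Gaussian coarse-grained model, and the update rule below are all stand-ins for illustration, not the paper's formulation; the point is only that no pre-collected dataset exists, and the force field is evaluated where the coarse-grained model proposes:

```python
import numpy as np

rng = np.random.default_rng(0)

def atomistic_force(x):
    """Stand-in for the atomistic force field (gradient of a double-well
    potential); in the paper this is an expensive molecular simulation query."""
    return -4.0 * x * (x ** 2 - 1.0)

# Predictive coarse-grained model: a single Gaussian with a learnable mean.
mu, sigma, lr = 0.2, 0.5, 0.01

for _ in range(100):
    x = mu + sigma * rng.standard_normal(32)  # configurations proposed by the CG model
    f = atomistic_force(x)                    # force field queried only at those proposals
    mu += lr * f.mean()                       # heuristic update, for illustration only

print(np.isfinite(mu))  # True
```

Contrast this with the usual pipeline, where a fixed library of atomistic trajectories is generated first and the coarse-grained model is fit to it afterwards.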

Integration of adversarial autoencoders with residual dense convolutional networks for estimation of non-Gaussian hydraulic conductivities

1 code implementation · 26 Jun 2019 · Shaoxing Mo, Nicholas Zabaras, Xiaoqing Shi, Jichun Wu

In addition, a deep residual dense convolutional network (DRDCN) is proposed for surrogate modeling of forward models with high-dimensional and highly complex mappings.

Modeling the Dynamics of PDE Systems with Physics-Constrained Deep Auto-Regressive Networks

1 code implementation · 13 Jun 2019 · Nicholas Geneva, Nicholas Zabaras

In recent years, deep learning has proven to be a viable methodology for surrogate modeling and uncertainty quantification for a vast number of physical systems.

Uncertainty Quantification

Physics-Constrained Deep Learning for High-dimensional Surrogate Modeling and Uncertainty Quantification without Labeled Data

1 code implementation · 18 Jan 2019 · Yinhao Zhu, Nicholas Zabaras, Phaedon-Stelios Koutsourelakis, Paris Perdikaris

Surrogate modeling and uncertainty quantification tasks for PDE systems are most often considered as supervised learning problems where input and output data pairs are used for training.

Small Data Image Classification · Uncertainty Quantification

Deep autoregressive neural networks for high-dimensional inverse problems in groundwater contaminant source identification

1 code implementation · 22 Dec 2018 · Shaoxing Mo, Nicholas Zabaras, Xiaoqing Shi, Jichun Wu

Results indicate that, with relatively limited training data, the deep autoregressive neural network consisting of 27 convolutional layers is capable of providing an accurate approximation for the high-dimensional model input-output relationship.

Computational Efficiency
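An autoregressive surrogate learns only a one-step map from the state at time t to the state at t+1 and is then rolled out by feeding predictions back in. The sketch below replaces the paper's 27-layer convolutional network with a single random linear layer (shapes and names are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "network": one linear layer in place of the paper's 27-layer
# convolutional model (random weights, used only to show the rollout pattern).
n = 32  # flattened concentration-field size (assumed)
W = rng.standard_normal((n, n)) / np.sqrt(n)

def surrogate_step(c):
    """Predict the concentration field at t+1 from the field at t."""
    return np.tanh(W @ c)

def rollout(c0, n_steps):
    """Autoregressive prediction: each output becomes the next input, so an
    arbitrary time horizon needs only one trained one-step model."""
    states = [c0]
    for _ in range(n_steps):
        states.append(surrogate_step(states[-1]))
    return np.stack(states)

traj = rollout(rng.standard_normal(n), n_steps=5)
print(traj.shape)  # (6, 32)
```

In the inverse-problem setting, this cheap rollout replaces the expensive transport simulator inside the sampling loop.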

Predictive Collective Variable Discovery with Deep Bayesian Models

1 code implementation · 18 Sep 2018 · Markus Schöberl, Nicholas Zabaras, Phaedon-Stelios Koutsourelakis

In this work, we formulate the discovery of CVs as a Bayesian inference problem and consider the CVs as hidden generators of the full-atomistic trajectory.

Bayesian Inference · Variational Inference

Structured Bayesian Gaussian process latent variable model: applications to data-driven dimensionality reduction and high-dimensional inversion

1 code implementation · 11 Jul 2018 · Steven Atkinson, Nicholas Zabaras

A structured Bayesian Gaussian process latent variable model is used both to construct a low-dimensional generative model of the sample-based stochastic prior as well as a surrogate for the forward evaluation.

Dimensionality Reduction

Quantifying model form uncertainty in Reynolds-averaged turbulence models with Bayesian deep neural networks

1 code implementation · 8 Jul 2018 · Nicholas Geneva, Nicholas Zabaras

Uncertainty quantification for such data-driven models is essential since their predictive capability rapidly declines as they are tested for flow physics that deviate from that in the training data.

Uncertainty Quantification

Deep convolutional encoder-decoder networks for uncertainty quantification of dynamic multiphase flow in heterogeneous media

1 code implementation · 2 Jul 2018 · Shaoxing Mo, Yinhao Zhu, Nicholas Zabaras, Xiaoqing Shi, Jichun Wu

A training strategy combining a regression loss and a segmentation loss is proposed in order to better approximate the discontinuous saturation field.

Computational Efficiency · regression +1
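A combined regression-plus-segmentation objective can be written in a few lines. The equal weighting and the threshold rule for the "invaded" class below are assumptions for illustration; the paper's exact treatment of the discontinuous front differs:

```python
import numpy as np

def combined_loss(pred, target, alpha=0.5, eps=1e-7):
    """Regression (MSE) term plus a segmentation (binary cross-entropy) term;
    the BCE targets the sharp front of the discontinuous saturation field."""
    mse = np.mean((pred - target) ** 2)
    p = np.clip(pred, eps, 1.0 - eps)
    mask = (target > 0).astype(float)  # "invaded" vs. "uninvaded" cells (assumed rule)
    bce = -np.mean(mask * np.log(p) + (1.0 - mask) * np.log(1.0 - p))
    return alpha * mse + (1.0 - alpha) * bce

# Toy saturation values in [0, 1] for four grid cells.
pred = np.array([0.9, 0.8, 0.1, 0.05])
target = np.array([1.0, 0.7, 0.0, 0.0])
print(combined_loss(pred, target) > 0.0)  # True
```

A pure MSE loss tends to blur a discontinuity; the cross-entropy term penalizes putting mass on the wrong side of the front, which sharpens it.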

Structured Bayesian Gaussian process latent variable model

no code implementations · 22 May 2018 · Steven Atkinson, Nicholas Zabaras

We introduce a Bayesian Gaussian process latent variable model that explicitly captures spatial correlations in data using a parameterized spatial kernel and leveraging structure-exploiting algebra on the model covariance matrices for computational tractability.

Imputation · Super-Resolution +2
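One common form of structure-exploiting algebra for spatial covariances is the Kronecker identity, which lets a covariance built from per-axis kernels act on a vector without ever materializing the full matrix. The kernel choice and sizes below are assumptions; the identity itself is standard:

```python
import numpy as np

rng = np.random.default_rng(0)

# Per-axis squared-exponential kernels; the full spatial covariance is their
# Kronecker product (kernel and grid sizes are assumptions for the sketch).
nx, ny = 10, 12
ix, iy = np.arange(nx), np.arange(ny)
Kx = np.exp(-0.5 * ((ix[:, None] - ix[None, :]) / 2.0) ** 2)
Ky = np.exp(-0.5 * ((iy[:, None] - iy[None, :]) / 2.0) ** 2)

v = rng.standard_normal(nx * ny)

# Naive: materialize the (nx*ny) x (nx*ny) covariance -- O(n^2) memory.
naive = np.kron(Kx, Ky) @ v

# Structure-exploiting: (Kx (x) Ky) v equals vec(Kx X Ky^T) for X = unvec(v),
# so the Kronecker product is never formed.
X = v.reshape(nx, ny)
fast = (Kx @ X @ Ky.T).ravel()

print(np.allclose(naive, fast))  # True
```

For a d-dimensional grid the same trick applies factor by factor, which is what makes exact Gaussian process algebra tractable on large structured inputs.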

Bayesian Deep Convolutional Encoder-Decoder Networks for Surrogate Modeling and Uncertainty Quantification

no code implementations · 21 Jan 2018 · Yinhao Zhu, Nicholas Zabaras

We are interested in the development of surrogate models for uncertainty quantification and propagation in problems governed by stochastic PDEs using a deep convolutional encoder-decoder network in a similar fashion to approaches considered in deep learning for image-to-image regression tasks.

Bayesian Inference · Gaussian Processes +2

Predictive Coarse-Graining

no code implementations · 26 May 2016 · Markus Schöberl, Nicholas Zabaras, Phaedon-Stelios Koutsourelakis

We propose a data-driven, coarse-graining formulation in the context of equilibrium statistical mechanics.

Model Selection
