Search Results for author: Paris Perdikaris

Found 47 papers, 34 papers with code

Inferring solutions of differential equations using noisy multi-fidelity data

1 code implementation • 16 Jul 2016 • Maziar Raissi, Paris Perdikaris, George Em Karniadakis

For more than two centuries, solutions of differential equations have been obtained either analytically or numerically based on typically well-behaved forcing and boundary conditions for well-posed problems.

Active Learning

Numerical Gaussian Processes for Time-dependent and Non-linear Partial Differential Equations

1 code implementation • 29 Mar 2017 • Maziar Raissi, Paris Perdikaris, George Em Karniadakis

Numerical Gaussian processes, by construction, are designed to deal with cases where: (1) all we observe are noisy data on black-box initial conditions, and (2) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent partial differential equations.

Gaussian Processes

Physics Informed Deep Learning (Part II): Data-driven Discovery of Nonlinear Partial Differential Equations

23 code implementations • 28 Nov 2017 • Maziar Raissi, Paris Perdikaris, George Em Karniadakis

We introduce physics informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations.

Physics Informed Deep Learning (Part I): Data-driven Solutions of Nonlinear Partial Differential Equations

29 code implementations • 28 Nov 2017 • Maziar Raissi, Paris Perdikaris, George Em Karniadakis

We introduce physics informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations.
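
A minimal sketch of the core idea in JAX: the training loss combines a data/boundary misfit with the squared residual of the governing PDE, evaluated by automatic differentiation at collocation points. The toy 1D problem u''(x) = -π² sin(πx), the architecture, and all names below are illustrative assumptions, not the paper's exact setup.

```python
# Minimal physics-informed neural network (PINN) loss sketch in JAX.
import jax
import jax.numpy as jnp

def init_params(key, layers=(1, 32, 32, 1)):
    params = []
    for d_in, d_out in zip(layers[:-1], layers[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (d_in, d_out)) * jnp.sqrt(2.0 / d_in),
                       jnp.zeros(d_out)))
    return params

def u(params, x):
    # scalar-in, scalar-out tanh MLP approximating the PDE solution u(x)
    h = jnp.reshape(x, (1,))
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]

def residual(params, x):
    # residual of the toy problem u''(x) + pi^2 sin(pi x) = 0
    u_x = jax.grad(u, argnums=1)
    u_xx = jax.grad(u_x, argnums=1)
    return u_xx(params, x) + jnp.pi ** 2 * jnp.sin(jnp.pi * x)

def loss(params, x_col, x_bc, u_bc):
    pde = jnp.mean(jax.vmap(lambda x: residual(params, x))(x_col) ** 2)   # physics term
    data = jnp.mean((jax.vmap(lambda x: u(params, x))(x_bc) - u_bc) ** 2)  # data/boundary term
    return pde + data
```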

Multistep Neural Networks for Data-driven Discovery of Nonlinear Dynamical Systems

2 code implementations • 4 Jan 2018 • Maziar Raissi, Paris Perdikaris, George Em Karniadakis

The process of transforming observed data into predictive mathematical models of the physical world has always been paramount in science and engineering.

Learning Parameters and Constitutive Relationships with Physics Informed Deep Neural Networks

1 code implementation • 10 Aug 2018 • Alexandre M. Tartakovsky, Carlos Ortiz Marrero, Paris Perdikaris, Guzel D. Tartakovsky, David Barajas-Solano

We employ physics informed DNNs to estimate the unknown space-dependent diffusion coefficient in a linear diffusion equation and an unknown constitutive relationship in a non-linear diffusion equation.
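
The inverse setup described above can be sketched with two networks trained jointly: one for the state u(x) and one for the unknown coefficient k(x), tied together by the PDE residual. The toy 1D flux form d/dx(k(x) du/dx) = f(x) and all names below are assumptions for illustration, not the paper's exact configuration.

```python
# Sketch: jointly fitting a state network u(x; theta) and an unknown
# space-dependent coefficient k(x; phi) from sparse observations of u.
import jax
import jax.numpy as jnp

def net(params, x):                      # tiny tanh MLP, scalar in / scalar out
    h = jnp.reshape(x, (1,))
    for w, b in params[:-1]:
        h = jnp.tanh(w @ h + b)
    w, b = params[-1]
    return (w @ h + b)[0]

def init(key, widths=(1, 16, 16, 1)):
    ps = []
    for m, n in zip(widths[1:], widths[:-1]):
        key, sub = jax.random.split(key)
        ps.append((0.1 * jax.random.normal(sub, (m, n)), jnp.zeros(m)))
    return ps

def flux_residual(theta, phi, x, f):
    u = lambda z: net(theta, z)
    k = lambda z: jax.nn.softplus(net(phi, z))      # keep the coefficient positive
    flux = lambda z: k(z) * jax.grad(u)(z)
    return jax.grad(flux)(x) - f(x)                 # d/dx(k du/dx) - f

def loss(theta, phi, x_obs, u_obs, x_col, f):
    data = jnp.mean((jax.vmap(lambda z: net(theta, z))(x_obs) - u_obs) ** 2)
    pde = jnp.mean(jax.vmap(lambda z: flux_residual(theta, phi, z, f))(x_col) ** 2)
    return data + pde
```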

Analysis of PDEs Computational Physics

Adversarial Uncertainty Quantification in Physics-Informed Neural Networks

2 code implementations • 9 Nov 2018 • Yibo Yang, Paris Perdikaris

We present a deep learning framework for quantifying and propagating uncertainty in systems governed by non-linear differential equations using physics-informed neural networks.

Uncertainty Quantification

Physics-informed deep generative models

no code implementations • 9 Dec 2018 • Yibo Yang, Paris Perdikaris

We consider the application of deep generative models in propagating uncertainty through complex physical systems.

Variational Inference

Conditional deep surrogate models for stochastic, high-dimensional, and multi-fidelity systems

2 code implementations • 15 Jan 2019 • Yibo Yang, Paris Perdikaris

We present a probabilistic deep learning methodology that enables the construction of predictive data-driven surrogates for stochastic systems.

Probabilistic Deep Learning • Variational Inference

Physics-Constrained Deep Learning for High-dimensional Surrogate Modeling and Uncertainty Quantification without Labeled Data

1 code implementation • 18 Jan 2019 • Yinhao Zhu, Nicholas Zabaras, Phaedon-Stelios Koutsourelakis, Paris Perdikaris

Surrogate modeling and uncertainty quantification tasks for PDE systems are most often considered as supervised learning problems where input and output data pairs are used for training.

Small Data Image Classification • Uncertainty Quantification

A comparative study of physics-informed neural network models for learning unknown dynamics and constitutive relations

no code implementations • 2 Apr 2019 • Ramakrishna Tipireddy, Paris Perdikaris, Panos Stinis, Alexandre Tartakovsky

We investigate the use of discrete and continuous versions of physics-informed neural network methods for learning unknown dynamics or constitutive relations of a dynamical system.

Multi-fidelity classification using Gaussian processes: accelerating the prediction of large-scale computational models

1 code implementation • 9 May 2019 • Francisco Sahli Costabal, Paris Perdikaris, Ellen Kuhl, Daniel E. Hurtado

In an application to cardiac electrophysiology, the multi-fidelity classifier achieves an F1 score, the harmonic mean of precision and recall, of 99.6% compared to 74.1% for a single-fidelity classifier when both are trained with 50 samples.
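
For reference, the quoted metric combines precision and recall as their harmonic mean; a minimal illustration (the example numbers below are arbitrary, not taken from the paper):

```python
def f1_score(precision: float, recall: float) -> float:
    # harmonic mean of precision and recall
    return 2.0 * precision * recall / (precision + recall)

print(f1_score(0.8, 0.5))   # 0.615...: F1 is pulled toward the smaller of the two
```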

Active Learning • BIG-bench Machine Learning • +2

Machine learning in cardiovascular flows modeling: Predicting arterial blood pressure from non-invasive 4D flow MRI data using physics-informed neural networks

1 code implementation • 13 May 2019 • Georgios Kissas, Yibo Yang, Eileen Hwuang, Walter R. Witschey, John A. Detre, Paris Perdikaris

Such models can be nowadays deployed on large patient-specific topologies of systemic arterial networks and return detailed predictions on flow patterns, wall shear stresses, and pulse wave propagation.

Understanding and mitigating gradient pathologies in physics-informed neural networks

1 code implementation • 13 Jan 2020 • Sifan Wang, Yujun Teng, Paris Perdikaris

The widespread use of neural networks across different scientific domains often involves constraining them to satisfy certain symmetries, conservation laws, or other domain knowledge.

Philosophy

Bayesian differential programming for robust systems identification under uncertainty

1 code implementation • 15 Apr 2020 • Yibo Yang, Mohamed Aziz Bhouri, Paris Perdikaris

This paper presents a machine learning framework for Bayesian systems identification from noisy, sparse and irregular observations of nonlinear dynamical systems.

Bayesian Inference Model Discovery

Deep learning of free boundary and Stefan problems

1 code implementation • 4 Jun 2020 • Sifan Wang, Paris Perdikaris

Free boundary problems appear naturally in numerous areas of mathematics, science and engineering.

When and why PINNs fail to train: A neural tangent kernel perspective

1 code implementation • 28 Jul 2020 • Sifan Wang, Xinling Yu, Paris Perdikaris

In this work, we aim to investigate these questions through the lens of the Neural Tangent Kernel (NTK); a kernel that captures the behavior of fully-connected neural networks in the infinite width limit during training via gradient descent.
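
As a rough illustration of the object being analyzed, the finite-width (empirical) NTK of a scalar-output network can be assembled from parameter Jacobians, K(x, x') = ⟨∂f(x)/∂θ, ∂f(x')/∂θ⟩. The architecture and sizes below are assumptions, not the paper's setup.

```python
# Empirical (finite-width) neural tangent kernel for a scalar-output MLP.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def mlp(params, x):
    h = x
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]

def init(key, layers=(1, 64, 64, 1)):
    params = []
    for d_in, d_out in zip(layers[:-1], layers[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in),
                       jnp.zeros(d_out)))
    return params

def empirical_ntk(params, x1, x2):
    def jac(x):
        grads = jax.grad(mlp)(params, x)   # gradient of f(x) w.r.t. all parameters
        return ravel_pytree(grads)[0]      # flatten the parameter pytree to one vector
    j1 = jax.vmap(jac)(x1)                 # (n1, n_params)
    j2 = jax.vmap(jac)(x2)                 # (n2, n_params)
    return j1 @ j2.T                       # K[i, j] = <df(x_i)/dθ, df(x_j)/dθ>

params = init(jax.random.PRNGKey(0))
x = jnp.linspace(-1.0, 1.0, 8).reshape(-1, 1)
K = empirical_ntk(params, x, x)            # its spectrum governs training dynamics
```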

Learning Unknown Physics of non-Newtonian Fluids

no code implementations • 26 Aug 2020 • Brandon Reyes, Amanda A. Howard, Paris Perdikaris, Alexandre M. Tartakovsky

Once a viscosity model is learned, we use the PINN method to solve the momentum conservation equation for non-Newtonian fluid flow using only the boundary conditions.

On the eigenvector bias of Fourier feature networks: From regression to solving multi-scale PDEs with physics-informed neural networks

1 code implementation • 18 Dec 2020 • Sifan Wang, Hanwen Wang, Paris Perdikaris

Physics-informed neural networks (PINNs) are demonstrating remarkable promise in integrating physical models with gappy and noisy observational data, but they still struggle in cases where the target functions to be approximated exhibit high-frequency or multi-scale features.
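
The input mapping at issue can be sketched as a random Fourier feature embedding applied before the network; the embedding width and bandwidth σ below are illustrative assumptions, with σ controlling the bias toward higher frequencies.

```python
# Random Fourier feature embedding: gamma(x) = [cos(2*pi*B x), sin(2*pi*B x)],
# with B_ij ~ N(0, sigma^2). Feed `feats` into an MLP instead of the raw inputs.
import jax
import jax.numpy as jnp

def fourier_features(x, B):
    proj = 2.0 * jnp.pi * x @ B.T                 # (n_points, n_features)
    return jnp.concatenate([jnp.cos(proj), jnp.sin(proj)], axis=-1)

key = jax.random.PRNGKey(0)
sigma = 10.0                                      # larger sigma -> higher-frequency bias
B = sigma * jax.random.normal(key, (128, 1))      # maps 1D inputs to 256 features
x = jnp.linspace(0.0, 1.0, 64).reshape(-1, 1)
feats = fourier_features(x, B)
```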

Output-Weighted Sampling for Multi-Armed Bandits with Extreme Payoffs

1 code implementation • 19 Feb 2021 • Yibo Yang, Antoine Blanchard, Themistoklis Sapsis, Paris Perdikaris

We present a new type of acquisition functions for online decision making in multi-armed and contextual bandit problems with extreme payoffs.

Decision Making • Gaussian Processes • +1

Learning atrial fiber orientations and conductivity tensors from intracardiac maps using physics-informed neural networks

no code implementations • 22 Feb 2021 • Thomas Grandits, Simone Pezzuto, Francisco Sahli Costabal, Paris Perdikaris, Thomas Pock, Gernot Plank, Rolf Krause

In this work, we employ a recently developed approach, called physics informed neural networks, to learn the fiber orientations from electroanatomical maps, taking into account the physics of the electrical wave propagation.

Gaussian processes meet NeuralODEs: A Bayesian framework for learning the dynamics of partially observed systems from scarce and noisy data

1 code implementation • 4 Mar 2021 • Mohamed Aziz Bhouri, Paris Perdikaris

This paper presents a machine learning framework (GP-NODE) for Bayesian systems identification from partial, noisy and irregular observations of nonlinear dynamical systems.

Bayesian Inference • Gaussian Processes • +1

Learning the solution operator of parametric partial differential equations with physics-informed DeepOnets

2 code implementations • 19 Mar 2021 • Sifan Wang, Hanwen Wang, Paris Perdikaris

Deep operator networks (DeepONets) are receiving increased attention thanks to their demonstrated capability to approximate nonlinear operators between infinite-dimensional Banach spaces.
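
A bare-bones sketch of the DeepONet parameterization referenced here: a branch network encodes the input function sampled at fixed sensor locations, a trunk network encodes the query coordinate, and the operator output is their inner product. Layer sizes and names below are assumptions.

```python
# Skeleton of a DeepONet forward pass: G(u)(y) ≈ <branch(u), trunk(y)>.
import jax
import jax.numpy as jnp

def mlp(params, x):
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return x @ w + b

def init_mlp(key, layers):
    params = []
    for d_in, d_out in zip(layers[:-1], layers[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in),
                       jnp.zeros(d_out)))
    return params

def deeponet(branch_params, trunk_params, u_sensors, y):
    b = mlp(branch_params, u_sensors)   # coefficients extracted from the input function
    t = mlp(trunk_params, y)            # basis values at the query location
    return jnp.sum(b * t)

key = jax.random.PRNGKey(0)
m, p = 100, 64                                      # number of sensors, latent dimension
branch = init_mlp(key, (m, 128, p))
trunk = init_mlp(key, (1, 128, p))
u_sensors = jnp.sin(jnp.linspace(0, jnp.pi, m))     # toy input function samples
out = deeponet(branch, trunk, u_sensors, jnp.array([0.5]))
```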

Long-time integration of parametric evolution equations with physics-informed DeepONets

1 code implementation • 9 Jun 2021 • Sifan Wang, Paris Perdikaris

Ordinary and partial differential equations (ODEs/PDEs) play a paramount role in analyzing and simulating complex dynamic processes across all corners of science and engineering.

Enhancing the trainability and expressivity of deep MLPs with globally orthogonal initialization

no code implementations • NeurIPS Workshop DLDE 2021 • Hanwen Wang, Isabelle Crawford-Eng, Paris Perdikaris

Multilayer Perceptrons (MLPs) define a fundamental model class that forms the backbone of many modern deep learning architectures.

Improved architectures and training algorithms for deep operator networks

1 code implementation • 4 Oct 2021 • Sifan Wang, Hanwen Wang, Paris Perdikaris

In this work we analyze the training dynamics of deep operator networks (DeepONets) through the lens of Neural Tangent Kernel (NTK) theory, and reveal a bias that favors the approximation of functions with larger magnitudes.

Operator learning

Fast PDE-constrained optimization via self-supervised operator learning

1 code implementation • 25 Oct 2021 • Sifan Wang, Mohamed Aziz Bhouri, Paris Perdikaris

Design and optimal control problems are among the fundamental, ubiquitous tasks we face in science and engineering.

Operator learning

Learning Operators with Coupled Attention

1 code implementation • 4 Jan 2022 • Georgios Kissas, Jacob Seidman, Leonardo Ferreira Guilhoto, Victor M. Preciado, George J. Pappas, Paris Perdikaris

Supervised operator learning is an emerging machine learning paradigm with applications to modeling the evolution of spatio-temporal dynamical systems and approximating general black-box relationships between functional data.

Operator learning

Physics-informed neural networks to learn cardiac fiber orientation from multiple electroanatomical maps

1 code implementation • 28 Jan 2022 • Carlos Ruiz Herrera, Thomas Grandits, Gernot Plank, Paris Perdikaris, Francisco Sahli Costabal, Simone Pezzuto

The inverse problem amounts to identifying the conduction velocity tensor of a cardiac propagation model from a set of sparse activation maps.

Scalable Uncertainty Quantification for Deep Operator Networks using Randomized Priors

1 code implementation • 6 Mar 2022 • Yibo Yang, Georgios Kissas, Paris Perdikaris

Finally, we provide an optimized JAX library called UQDeepONet that can accommodate large model architectures, large ensemble sizes, as well as large data-sets with excellent parallel performance on accelerated hardware, thereby enabling uncertainty quantification for DeepONets in realistic large-scale applications.
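
The randomized-prior construction behind this style of ensemble uncertainty quantification can be sketched as follows: each ensemble member adds a fixed, untrained random "prior" network to a trainable one, and the spread across members serves as the uncertainty estimate. Shown below for a plain MLP regressor rather than a DeepONet; sizes and the scale β are assumptions.

```python
# Randomized-prior ensemble: member k predicts f_k(x) = trainable_k(x) + beta * prior_k(x),
# where prior_k is a randomly initialized network that is never updated.
import jax
import jax.numpy as jnp

def mlp(params, x):
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return x @ w + b

def init_mlp(key, layers=(1, 32, 32, 1)):
    params = []
    for d_in, d_out in zip(layers[:-1], layers[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in),
                       jnp.zeros(d_out)))
    return params

def member_predict(trainable, prior, x, beta=1.0):
    # only `trainable` receives gradients; the prior network is frozen
    return mlp(trainable, x) + beta * jax.lax.stop_gradient(mlp(prior, x))

key = jax.random.PRNGKey(0)
members = [(init_mlp(k1), init_mlp(k2))
           for k1, k2 in (jax.random.split(k, 2) for k in jax.random.split(key, 8))]
x = jnp.linspace(-1, 1, 50).reshape(-1, 1)
preds = jnp.stack([member_predict(t, p, x) for t, p in members])
mean, std = preds.mean(0), preds.std(0)       # predictive mean and uncertainty estimate
```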

Uncertainty Quantification

Respecting causality is all you need for training physics-informed neural networks

3 code implementations • 14 Mar 2022 • Sifan Wang, Shyam Sankaran, Paris Perdikaris

While the popularity of physics-informed neural networks (PINNs) is steadily rising, to this date PINNs have not been successful in simulating dynamical systems whose solution exhibits multi-scale, chaotic or turbulent behavior.
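
The causality-respecting idea can be sketched as a weighting of the temporal residual loss in which later time slices are only activated once earlier residuals are small; the exponential gating below follows that logic, with ε treated here as an assumed tunable parameter and the weights held out of the gradient.

```python
# Sketch of causality-respecting residual weighting: w_i = exp(-eps * sum_{j<i} L_j),
# where `residual_losses` is the mean squared PDE residual per time slice, ordered t_0..t_N.
import jax
import jax.numpy as jnp

def causal_weighted_loss(residual_losses, eps=1.0):
    cumulative = jnp.concatenate([jnp.zeros(1),
                                  jnp.cumsum(residual_losses)[:-1]])
    weights = jnp.exp(-eps * cumulative)               # w_0 = 1; later slices are gated
    return jnp.mean(jax.lax.stop_gradient(weights) * residual_losses)
```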

Attribute

NOMAD: Nonlinear Manifold Decoders for Operator Learning

no code implementations • 7 Jun 2022 • Jacob H. Seidman, Georgios Kissas, Paris Perdikaris, George J. Pappas

Supervised learning in function spaces is an emerging area of machine learning research with applications to the prediction of complex physical systems such as fluid flows, solid mechanics, and climate modeling.

Operator learning

Mitigating Propagation Failures in Physics-informed Neural Networks using Retain-Resample-Release (R3) Sampling

1 code implementation • 5 Jul 2022 • Arka Daw, Jie Bu, Sifan Wang, Paris Perdikaris, Anuj Karpatne

In this paper, we provide a novel perspective of failure modes of PINNs by hypothesizing that training PINNs relies on successful "propagation" of solution from initial and/or boundary condition points to interior points.

Semi-supervised Invertible Neural Operators for Bayesian Inverse Problems

1 code implementation • 6 Sep 2022 • Sebastian Kaltenbach, Paris Perdikaris, Phaedon-Stelios Koutsourelakis

Neural Operators offer a powerful, data-driven tool for solving parametric PDEs as they can represent maps between infinite-dimensional function spaces.

Δ-PINNs: physics-informed neural networks on complex geometries

1 code implementation • 8 Sep 2022 • Francisco Sahli Costabal, Simone Pezzuto, Paris Perdikaris

We approximate the eigenfunctions as well as the operators involved in the partial differential equations with finite elements.

Random Weight Factorization Improves the Training of Continuous Neural Representations

1 code implementation • 3 Oct 2022 • Sifan Wang, Hanwen Wang, Jacob H. Seidman, Paris Perdikaris

Continuous neural representations have recently emerged as a powerful and flexible alternative to classical discretized representations of signals.
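
One common form of the factorization can be sketched as re-parameterizing each weight matrix with a trainable per-neuron scale, W = V · exp(s), initialized so that the layer matches a conventional dense layer at the start of training. The initialization constants (mu, sigma) below are illustrative assumptions.

```python
# Sketch of a random-weight-factorized dense layer: W = V * exp(s).
import jax
import jax.numpy as jnp

def init_rwf_layer(key, d_in, d_out, mu=0.5, sigma=0.1):
    k1, k2 = jax.random.split(key)
    s = mu + sigma * jax.random.normal(k1, (d_out,))           # log-scale, one per output neuron
    v = jax.random.normal(k2, (d_in, d_out)) / (jnp.sqrt(d_in) * jnp.exp(s))
    return {"s": s, "v": v, "b": jnp.zeros(d_out)}              # product V*exp(s) matches a standard init

def rwf_dense(layer, x):
    w = layer["v"] * jnp.exp(layer["s"])    # reconstruct W; the scale broadcasts over columns
    return x @ w + layer["b"]

layer = init_rwf_layer(jax.random.PRNGKey(0), d_in=2, d_out=32)
y = rwf_dense(layer, jnp.ones((4, 2)))      # behaves like a standard dense layer at initialization
```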

Inverse Rendering

Scalable Bayesian optimization with high-dimensional outputs using randomized prior networks

1 code implementation • 14 Feb 2023 • Mohamed Aziz Bhouri, Michael Joly, Robert Yu, Soumalya Sarkar, Paris Perdikaris

Several fundamental problems in science and engineering consist of global optimization tasks involving unknown high-dimensional (black-box) functions that map a set of controllable variables to the outcomes of an expensive experiment.

Bayesian Optimization • Decision Making • +1

Variational Autoencoding Neural Operators

no code implementations • 20 Feb 2023 • Jacob H. Seidman, Georgios Kissas, George J. Pappas, Paris Perdikaris

Unsupervised learning with functional data is an emerging paradigm of machine learning research with applications to computer vision, climate modeling and physical systems.

Operator learning

Ensemble learning for Physics Informed Neural Networks: a Gradient Boosting approach

no code implementations • 25 Feb 2023 • Zhiwei Fang, Sifan Wang, Paris Perdikaris

While the popularity of physics-informed neural networks (PINNs) is steadily rising, to this date, PINNs have not been successful in simulating multi-scale and singular perturbation problems.

Ensemble Learning

Gaussian Process Port-Hamiltonian Systems: Bayesian Learning with Physics Prior

no code implementations • 15 May 2023 • Thomas Beckers, Jacob Seidman, Paris Perdikaris, George J. Pappas

Data-driven approaches achieve remarkable results for the modeling of complex dynamics based on collected data.

Uncertainty Quantification

PPDONet: Deep Operator Networks for Fast Prediction of Steady-State Solutions in Disk-Planet Systems

1 code implementation • 18 May 2023 • Shunyuan Mao, Ruobing Dong, Lu Lu, Kwang Moo Yi, Sifan Wang, Paris Perdikaris

We develop a tool, which we name Protoplanetary Disk Operator Network (PPDONet), that can predict the solution of disk-planet interactions in protoplanetary disks in real-time.

An Expert's Guide to Training Physics-informed Neural Networks

1 code implementation • 16 Aug 2023 • Sifan Wang, Shyam Sankaran, Hanwen Wang, Paris Perdikaris

Physics-informed neural networks (PINNs) have been popularized as a deep learning framework that can seamlessly synthesize observational data and partial differential equation (PDE) constraints.

Learning Only On Boundaries: a Physics-Informed Neural operator for Solving Parametric Partial Differential Equations in Complex Geometries

no code implementations • 24 Aug 2023 • Zhiwei Fang, Sifan Wang, Paris Perdikaris

By reformulating the PDEs into boundary integral equations (BIEs), we can train the operator network solely on the boundary of the domain.

PirateNets: Physics-informed Deep Learning with Residual Adaptive Networks

1 code implementation • 1 Feb 2024 • Sifan Wang, Bowen Li, Yuhan Chen, Paris Perdikaris

While physics-informed neural networks (PINNs) have become a popular deep learning framework for tackling forward and inverse problems governed by partial differential equations (PDEs), their performance is known to degrade when larger and deeper neural network architectures are employed.
