1 code implementation • 16 Jul 2016 • Maziar Raissi, Paris Perdikaris, George Em Karniadakis
For more than two centuries, solutions of differential equations have been obtained either analytically or numerically based on typically well-behaved forcing and boundary conditions for well-posed problems.
1 code implementation • 29 Mar 2017 • Maziar Raissi, Paris Perdikaris, George Em Karniadakis
Numerical Gaussian processes, by construction, are designed to deal with cases where: (1) all we observe are noisy data on black-box initial conditions, and (2) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent partial differential equations.
23 code implementations • 28 Nov 2017 • Maziar Raissi, Paris Perdikaris, George Em Karniadakis
We introduce physics informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations.
29 code implementations • 28 Nov 2017 • Maziar Raissi, Paris Perdikaris, George Em Karniadakis
We introduce physics informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations.
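The two entries above describe the same core idea: train a network against a composite loss that fits the data while penalizing the residual of the governing equation. A minimal sketch of that loss for the hypothetical toy ODE u'(t) = -u(t), u(0) = 1, with finite differences standing in for the automatic differentiation a real PINN would use:

```python
import math

# Sketch of the composite PINN loss for the toy ODE u'(t) = -u(t), u(0) = 1.
# A real PINN differentiates a neural network with automatic differentiation;
# here finite differences on a candidate function stand in, purely to
# illustrate the data-fit + physics-residual structure. The ODE, candidate
# functions, and point sets are hypothetical, not taken from the papers.

def pinn_loss(u, t_data, u_data, t_colloc, h=1e-5):
    # Data term: mismatch with (possibly noisy) observations.
    data_loss = sum((u(t) - d) ** 2 for t, d in zip(t_data, u_data)) / len(t_data)
    # Physics term: squared residual of u' + u = 0 at collocation points.
    phys = 0.0
    for t in t_colloc:
        du = (u(t + h) - u(t - h)) / (2 * h)
        phys += (du + u(t)) ** 2
    return data_loss + phys / len(t_colloc)

t_data = [0.0, 0.5, 1.0]
u_data = [math.exp(-t) for t in t_data]     # clean observations of the truth
t_colloc = [i / 19 for i in range(20)]      # collocation grid on [0, 1]

exact = lambda t: math.exp(-t)   # satisfies data and physics: loss near zero
wrong = lambda t: 1.0 - t        # fits u(0) but violates the ODE: larger loss
print(pinn_loss(exact, t_data, u_data, t_colloc))
print(pinn_loss(wrong, t_data, u_data, t_colloc))
```

The physics term is what distinguishes this from ordinary supervised regression: candidates that match the data but break the equation are penalized at the collocation points.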
2 code implementations • 4 Jan 2018 • Maziar Raissi, Paris Perdikaris, George Em. Karniadakis
The process of transforming observed data into predictive mathematical models of the physical world has always been paramount in science and engineering.
no code implementations • 2 Aug 2018 • Mamikon Gulian, Maziar Raissi, Paris Perdikaris, George Karniadakis
We extend this framework to linear space-fractional differential equations.
1 code implementation • 10 Aug 2018 • Alexandre M. Tartakovsky, Carlos Ortiz Marrero, Paris Perdikaris, Guzel D. Tartakovsky, David Barajas-Solano
We employ physics informed DNNs to estimate the unknown space-dependent diffusion coefficient in a linear diffusion equation and an unknown constitutive relationship in a non-linear diffusion equation.
2 code implementations • 9 Nov 2018 • Yibo Yang, Paris Perdikaris
We present a deep learning framework for quantifying and propagating uncertainty in systems governed by non-linear differential equations using physics-informed neural networks.
no code implementations • 9 Dec 2018 • Yibo Yang, Paris Perdikaris
We consider the application of deep generative models in propagating uncertainty through complex physical systems.
2 code implementations • 15 Jan 2019 • Yibo Yang, Paris Perdikaris
We present a probabilistic deep learning methodology that enables the construction of predictive data-driven surrogates for stochastic systems.
1 code implementation • 18 Jan 2019 • Yinhao Zhu, Nicholas Zabaras, Phaedon-Stelios Koutsourelakis, Paris Perdikaris
Surrogate modeling and uncertainty quantification tasks for PDE systems are most often considered as supervised learning problems where input and output data pairs are used for training.
no code implementations • 2 Apr 2019 • Ramakrishna Tipireddy, Paris Perdikaris, Panos Stinis, Alexandre Tartakovsky
We investigate the use of discrete and continuous versions of physics-informed neural network methods for learning unknown dynamics or constitutive relations of a dynamical system.
1 code implementation • 9 May 2019 • Francisco Sahli Costabal, Paris Perdikaris, Ellen Kuhl, Daniel E. Hurtado
In an application to cardiac electrophysiology, the multi-fidelity classifier achieves an F1 score, the harmonic mean of precision and recall, of 99.6%, compared to 74.1% for a single-fidelity classifier when both are trained with 50 samples.
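The F1 score quoted here is the harmonic mean of precision and recall, which is dragged toward whichever of the two is smaller. A quick illustration with hypothetical values:

```python
def f1_score(precision, recall):
    # Harmonic mean of precision and recall.
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical values, not from the paper.
print(f1_score(0.9, 0.9))   # balanced case: F1 equals both
print(f1_score(0.99, 0.5))  # high precision cannot mask low recall
```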
1 code implementation • 13 May 2019 • Georgios Kissas, Yibo Yang, Eileen Hwuang, Walter R. Witschey, John A. Detre, Paris Perdikaris
Such models can nowadays be deployed on large patient-specific topologies of systemic arterial networks and return detailed predictions on flow patterns, wall shear stresses, and pulse wave propagation.
1 code implementation • 13 Jan 2020 • Sifan Wang, Yujun Teng, Paris Perdikaris
The widespread use of neural networks across different scientific domains often involves constraining them to satisfy certain symmetries, conservation laws, or other domain knowledge.
1 code implementation • 15 Apr 2020 • Yibo Yang, Mohamed Aziz Bhouri, Paris Perdikaris
This paper presents a machine learning framework for Bayesian systems identification from noisy, sparse and irregular observations of nonlinear dynamical systems.
1 code implementation • 4 Jun 2020 • Sifan Wang, Paris Perdikaris
Free boundary problems appear naturally in numerous areas of mathematics, science and engineering.
1 code implementation • 28 Jul 2020 • Sifan Wang, Xinling Yu, Paris Perdikaris
In this work, we aim to investigate these questions through the lens of the Neural Tangent Kernel (NTK); a kernel that captures the behavior of fully-connected neural networks in the infinite width limit during training via gradient descent.
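The NTK mentioned above is, for a finite network, the Gram matrix of inner products of parameter gradients; the infinite-width kernel is its large-width limit. A hypothetical sketch of the empirical NTK for a one-hidden-layer scalar network, with hand-written gradients (the network, width, and inputs are illustrative, not from the paper):

```python
import math, random

# Empirical NTK of f(x) = sum_i a_i * tanh(w_i * x + b_i):
# NTK(x1, x2) = <grad_theta f(x1), grad_theta f(x2)>.

random.seed(0)
WIDTH = 200
w = [random.gauss(0, 1) for _ in range(WIDTH)]
b = [random.gauss(0, 1) for _ in range(WIDTH)]
a = [random.gauss(0, 1) / math.sqrt(WIDTH) for _ in range(WIDTH)]

def grad_f(x):
    # Gradient of f(x) with respect to every parameter (a_i, w_i, b_i).
    g = []
    for ai, wi, bi in zip(a, w, b):
        t = math.tanh(wi * x + bi)
        s = 1.0 - t * t            # derivative of tanh
        g += [t, ai * s * x, ai * s]
    return g

def ntk(x1, x2):
    # Inner product of parameter gradients at the two inputs.
    return sum(p * q for p, q in zip(grad_f(x1), grad_f(x2)))

print(ntk(0.3, 0.7))
print(ntk(0.7, 0.3))  # the kernel is symmetric by construction
```

In the infinite-width limit this kernel stays (nearly) fixed during gradient descent, which is what makes it a useful lens on PINN training dynamics.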
no code implementations • 26 Aug 2020 • Brandon Reyes, Amanda A. Howard, Paris Perdikaris, Alexandre M. Tartakovsky
Once a viscosity model is learned, we use the PINN method to solve the momentum conservation equation for non-Newtonian fluid flow using only the boundary conditions.
1 code implementation • 18 Dec 2020 • Sifan Wang, Hanwen Wang, Paris Perdikaris
Physics-informed neural networks (PINNs) are demonstrating remarkable promise in integrating physical models with gappy and noisy observational data, but they still struggle in cases where the target functions to be approximated exhibit high-frequency or multi-scale features.
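A standard remedy explored in this line of work is to embed input coordinates with random Fourier features before the network sees them, so that high-frequency content becomes learnable. A minimal sketch (the frequency scale and feature count are hypothetical):

```python
import math, random

# Random Fourier feature embedding gamma(x) = [sin(2*pi*b*x), cos(2*pi*b*x)]
# for frequencies b drawn once from N(0, sigma^2). Larger sigma biases the
# downstream network toward higher-frequency targets. Values are illustrative.

random.seed(0)
NUM_FEATURES = 8
SIGMA = 10.0   # hypothetical scale hyperparameter
B = [random.gauss(0, SIGMA) for _ in range(NUM_FEATURES)]

def fourier_features(x):
    # Map a scalar coordinate to a 2*NUM_FEATURES vector fed to the network
    # in place of the raw coordinate x.
    return ([math.sin(2 * math.pi * bk * x) for bk in B] +
            [math.cos(2 * math.pi * bk * x) for bk in B])

emb = fourier_features(0.25)
print(len(emb))
```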
1 code implementation • 19 Feb 2021 • Yibo Yang, Antoine Blanchard, Themistoklis Sapsis, Paris Perdikaris
We present a new type of acquisition functions for online decision making in multi-armed and contextual bandit problems with extreme payoffs.
no code implementations • 22 Feb 2021 • Thomas Grandits, Simone Pezzuto, Francisco Sahli Costabal, Paris Perdikaris, Thomas Pock, Gernot Plank, Rolf Krause
In this work, we employ a recently developed approach, called physics informed neural networks, to learn the fiber orientations from electroanatomical maps, taking into account the physics of the electrical wave propagation.
1 code implementation • 4 Mar 2021 • Mohamed Aziz Bhouri, Paris Perdikaris
This paper presents a machine learning framework (GP-NODE) for Bayesian systems identification from partial, noisy and irregular observations of nonlinear dynamical systems.
2 code implementations • 19 Mar 2021 • Sifan Wang, Hanwen Wang, Paris Perdikaris
Deep operator networks (DeepONets) are receiving increased attention thanks to their demonstrated capability to approximate nonlinear operators between infinite-dimensional Banach spaces.
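The DeepONet architecture itself is compact: a branch net encodes the input function through its values at fixed sensor points, a trunk net encodes the query location, and the operator output is their dot product. A hypothetical forward-pass sketch with untrained random weights (sizes and layers are illustrative, not the paper's configuration):

```python
import math, random

# Minimal DeepONet forward pass: output(u)(y) = <branch(u), trunk(y)>.

random.seed(0)
M_SENSORS, P = 10, 4   # number of sensors and latent dimension (hypothetical)

Wb = [[random.gauss(0, 1) for _ in range(M_SENSORS)] for _ in range(P)]
Wt = [random.gauss(0, 1) for _ in range(P)]
bt = [random.gauss(0, 1) for _ in range(P)]

def branch(u_sensors):
    # One linear layer + tanh mapping sensor values to P latent coefficients.
    return [math.tanh(sum(wij * uj for wij, uj in zip(row, u_sensors)))
            for row in Wb]

def trunk(y):
    # Encodes the query coordinate into P basis values.
    return [math.tanh(wk * y + bk) for wk, bk in zip(Wt, bt)]

def deeponet(u_sensors, y):
    # Operator evaluation: dot product of branch and trunk outputs.
    return sum(bv * tv for bv, tv in zip(branch(u_sensors), trunk(y)))

sensors = [i / (M_SENSORS - 1) for i in range(M_SENSORS)]
u = [math.sin(math.pi * x) for x in sensors]   # one sample input function
print(deeponet(u, 0.5))                        # operator output at y = 0.5
```

Because the trunk is evaluated independently of the input function, the same trained branch encoding can be queried at arbitrary locations y.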
1 code implementation • 9 Jun 2021 • Sifan Wang, Paris Perdikaris
Ordinary and partial differential equations (ODEs/PDEs) play a paramount role in analyzing and simulating complex dynamic processes across all corners of science and engineering.
no code implementations • NeurIPS Workshop DLDE 2021 • Hanwen Wang, Isabelle Crawford-Eng, Paris Perdikaris
Multilayer Perceptrons (MLPs) define a fundamental model class that forms the backbone of many modern deep learning architectures.
1 code implementation • 4 Oct 2021 • Sifan Wang, Hanwen Wang, Paris Perdikaris
In this work we analyze the training dynamics of deep operator networks (DeepONets) through the lens of Neural Tangent Kernel (NTK) theory, and reveal a bias that favors the approximation of functions with larger magnitudes.
1 code implementation • 25 Oct 2021 • Sifan Wang, Mohamed Aziz Bhouri, Paris Perdikaris
Design and optimal control problems are among the fundamental, ubiquitous tasks we face in science and engineering.
no code implementations • 15 Dec 2021 • Lia Gander, Simone Pezzuto, Ali Gharaviri, Rolf Krause, Paris Perdikaris, Francisco Sahli Costabal
Computational models of atrial fibrillation have successfully been used to predict optimal ablation sites.
1 code implementation • 4 Jan 2022 • Georgios Kissas, Jacob Seidman, Leonardo Ferreira Guilhoto, Victor M. Preciado, George J. Pappas, Paris Perdikaris
Supervised operator learning is an emerging machine learning paradigm with applications to modeling the evolution of spatio-temporal dynamical systems and approximating general black-box relationships between functional data.
1 code implementation • 28 Jan 2022 • Carlos Ruiz Herrera, Thomas Grandits, Gernot Plank, Paris Perdikaris, Francisco Sahli Costabal, Simone Pezzuto
The inverse problem amounts to identifying the conduction velocity tensor of a cardiac propagation model from a set of sparse activation maps.
1 code implementation • 6 Mar 2022 • Yibo Yang, Georgios Kissas, Paris Perdikaris
Finally, we provide an optimized JAX library called {\em UQDeepONet} that can accommodate large model architectures, large ensemble sizes, as well as large data-sets with excellent parallel performance on accelerated hardware, thereby enabling uncertainty quantification for DeepONets in realistic large-scale applications.
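Ensemble-based uncertainty quantification of the kind described here boils down to aggregating the predictions of K independently trained models into a predictive mean and a spread. A plain-Python sketch of that aggregation, not the UQDeepONet API, with hypothetical numbers:

```python
import math

# Given predictions from K ensemble members at one query point, report the
# predictive mean and the standard deviation (epistemic uncertainty proxy).

def ensemble_stats(predictions):
    k = len(predictions)
    mean = sum(predictions) / k
    var = sum((p - mean) ** 2 for p in predictions) / k
    return mean, math.sqrt(var)

preds = [0.98, 1.02, 1.00, 0.95, 1.05]   # hypothetical outputs of 5 members
mean, std = ensemble_stats(preds)
print(mean, std)
```

In the library the authors describe, the ensemble forward passes would be vectorized on accelerated hardware rather than looped as here.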
no code implementations • 11 Mar 2022 • Simone Pezzuto, Paris Perdikaris, Francisco Sahli Costabal
We propose a method for identifying an ectopic activation in the heart non-invasively.
3 code implementations • 14 Mar 2022 • Sifan Wang, Shyam Sankaran, Paris Perdikaris
While the popularity of physics-informed neural networks (PINNs) is steadily rising, to date PINNs have not been successful in simulating dynamical systems whose solution exhibits multi-scale, chaotic or turbulent behavior.
no code implementations • 7 Jun 2022 • Jacob H. Seidman, Georgios Kissas, Paris Perdikaris, George J. Pappas
Supervised learning in function spaces is an emerging area of machine learning research with applications to the prediction of complex physical systems such as fluid flows, solid mechanics, and climate modeling.
1 code implementation • 5 Jul 2022 • Arka Daw, Jie Bu, Sifan Wang, Paris Perdikaris, Anuj Karpatne
In this paper, we provide a novel perspective of failure modes of PINNs by hypothesizing that training PINNs relies on successful "propagation" of solution from initial and/or boundary condition points to interior points.
1 code implementation • 6 Sep 2022 • Sebastian Kaltenbach, Paris Perdikaris, Phaedon-Stelios Koutsourelakis
Neural Operators offer a powerful, data-driven tool for solving parametric PDEs as they can represent maps between infinite-dimensional function spaces.
1 code implementation • 8 Sep 2022 • Francisco Sahli Costabal, Simone Pezzuto, Paris Perdikaris
We approximate the eigenfunctions as well as the operators involved in the partial differential equations with finite elements.
1 code implementation • 3 Oct 2022 • Sifan Wang, Hanwen Wang, Jacob H. Seidman, Paris Perdikaris
Continuous neural representations have recently emerged as a powerful and flexible alternative to classical discretized representations of signals.
1 code implementation • 14 Feb 2023 • Mohamed Aziz Bhouri, Michael Joly, Robert Yu, Soumalya Sarkar, Paris Perdikaris
Several fundamental problems in science and engineering consist of global optimization tasks involving unknown high-dimensional (black-box) functions that map a set of controllable variables to the outcomes of an expensive experiment.
no code implementations • 20 Feb 2023 • Jacob H. Seidman, Georgios Kissas, George J. Pappas, Paris Perdikaris
Unsupervised learning with functional data is an emerging paradigm of machine learning research with applications to computer vision, climate modeling and physical systems.
no code implementations • 25 Feb 2023 • Zhiwei Fang, Sifan Wang, Paris Perdikaris
While the popularity of physics-informed neural networks (PINNs) is steadily rising, to date PINNs have not been successful in simulating multi-scale and singular perturbation problems.
no code implementations • 15 May 2023 • Thomas Beckers, Jacob Seidman, Paris Perdikaris, George J. Pappas
Data-driven approaches achieve remarkable results for the modeling of complex dynamics based on collected data.
1 code implementation • 18 May 2023 • Shunyuan Mao, Ruobing Dong, Lu Lu, Kwang Moo Yi, Sifan Wang, Paris Perdikaris
We develop a tool, which we name Protoplanetary Disk Operator Network (PPDONet), that can predict the solution of disk-planet interactions in protoplanetary disks in real time.
1 code implementation • 16 Aug 2023 • Sifan Wang, Shyam Sankaran, Hanwen Wang, Paris Perdikaris
Physics-informed neural networks (PINNs) have been popularized as a deep learning framework that can seamlessly synthesize observational data and partial differential equation (PDE) constraints.
no code implementations • 24 Aug 2023 • Zhiwei Fang, Sifan Wang, Paris Perdikaris
By reformulating the PDEs into boundary integral equations (BIEs), we can train the operator network solely on the boundary of the domain.
1 code implementation • 1 Feb 2024 • Sifan Wang, Bowen Li, Yuhan Chen, Paris Perdikaris
While physics-informed neural networks (PINNs) have become a popular deep learning framework for tackling forward and inverse problems governed by partial differential equations (PDEs), their performance is known to degrade when larger and deeper neural network architectures are employed.