Search Results for author: Ameya D. Jagtap

Found 12 papers, 5 papers with code

RiemannONets: Interpretable Neural Operators for Riemann Problems

1 code implementation • 16 Jan 2024 • Ahmad Peyvan, Vivek Oommen, Ameya D. Jagtap, George Em Karniadakis

Developing the proper representations for simulating high-speed flows with strong shock waves, rarefactions, and contact discontinuities has been a long-standing question in numerical analysis.

Deep smoothness WENO scheme for two-dimensional hyperbolic conservation laws: A deep learning approach for learning smoothness indicators

no code implementations • 18 Sep 2023 • Tatiana Kossaczká, Ameya D. Jagtap, Matthias Ehrhardt

In this paper, we introduce an improved version of the fifth-order weighted essentially non-oscillatory (WENO) shock-capturing scheme by incorporating deep learning techniques.
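For context, the quantities the deep-learning approach targets are the classical Jiang–Shu smoothness indicators of the WENO5 scheme. The sketch below computes those standard indicators (not the paper's learned replacements) for one five-point stencil; it is a minimal illustration, not the authors' implementation.

```python
def weno5_smoothness_indicators(f):
    """Classical Jiang-Shu smoothness indicators for the three
    three-point sub-stencils of a five-point WENO5 stencil.
    f: the five cell averages [f_{i-2}, f_{i-1}, f_i, f_{i+1}, f_{i+2}].
    Returns (beta0, beta1, beta2); a larger value flags a less smooth
    sub-stencil and down-weights it in the WENO reconstruction."""
    fm2, fm1, f0, fp1, fp2 = f
    b0 = 13/12 * (fm2 - 2*fm1 + f0)**2 + 1/4 * (fm2 - 4*fm1 + 3*f0)**2
    b1 = 13/12 * (fm1 - 2*f0 + fp1)**2 + 1/4 * (fm1 - fp1)**2
    b2 = 13/12 * (f0 - 2*fp1 + fp2)**2 + 1/4 * (3*f0 - 4*fp1 + fp2)**2
    return b0, b1, b2

# On smooth (here linear) data all three indicators agree, giving
# near-optimal weights; a stencil crossing a jump gets a larger value.
print(weno5_smoothness_indicators([1., 2., 3., 4., 5.]))  # → (1.0, 1.0, 1.0)
print(weno5_smoothness_indicators([0., 0., 0., 1., 1.]))
```

The deep-learning variant in the paper learns improved indicators of this kind rather than using the fixed algebraic formulas above.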

Learning stiff chemical kinetics using extended deep neural operators

1 code implementation • 23 Feb 2023 • Somdatta Goswami, Ameya D. Jagtap, Hessam Babaee, Bryan T. Susi, George Em Karniadakis

Specifically, to train the DeepONet for the syngas model, we solve the skeletal kinetic model for different initial conditions.
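The DeepONet mentioned here follows the standard branch–trunk structure, G(u)(y) ≈ Σ_k branch_k(u) · trunk_k(y): a branch net encodes the input function (e.g., an initial condition sampled at sensor points) and a trunk net encodes the query location. The sketch below shows only that structure with arbitrary, untrained random weights; layer sizes and sensor counts are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Random-weight MLP (illustrative only; a real DeepONet is trained)."""
    Ws = [rng.standard_normal((m, n)) / np.sqrt(m)
          for m, n in zip(sizes[:-1], sizes[1:])]
    def forward(x):
        for W in Ws[:-1]:
            x = np.tanh(x @ W)
        return x @ Ws[-1]
    return forward

m, p = 32, 16              # sensor count and latent dimension (assumed)
branch = mlp([m, 64, p])   # encodes the input function u at m sensors
trunk  = mlp([1, 64, p])   # encodes the query location y

def deeponet(u_sensors, y):
    """Vanilla DeepONet evaluation: dot product of branch and trunk features."""
    b = branch(u_sensors)        # shape (p,)
    t = trunk(np.atleast_2d(y))  # shape (n_query, p)
    return t @ b                 # one output per query point

u = np.sin(np.linspace(0.0, np.pi, m))  # an input function sampled at sensors
print(deeponet(u, np.array([[0.1], [0.5]])).shape)  # (2,)
```

Training, as the excerpt notes, amounts to fitting these weights to solutions of the skeletal kinetic model generated for many different initial conditions.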


How important are activation functions in regression and classification? A survey, performance comparison, and future directions

no code implementations • 6 Sep 2022 • Ameya D. Jagtap, George Em Karniadakis

For this purpose, we also discuss various requirements for activation functions that have been used in the physics-informed machine learning framework.


Error estimates for physics informed neural networks approximating the Navier-Stokes equations

no code implementations • 17 Mar 2022 • Tim De Ryck, Ameya D. Jagtap, Siddhartha Mishra

We prove rigorous bounds on the errors resulting from the approximation of the incompressible Navier-Stokes equations with (extended) physics informed neural networks.

Physics-informed neural networks for inverse problems in supersonic flows

no code implementations • 23 Feb 2022 • Ameya D. Jagtap, Zhiping Mao, Nikolaus Adams, George Em Karniadakis

Accurate solutions to inverse supersonic compressible flow problems are often required for designing specialized aerospace vehicles.

When Do Extended Physics-Informed Neural Networks (XPINNs) Improve Generalization?

no code implementations • 20 Sep 2021 • Zheyuan Hu, Ameya D. Jagtap, George Em Karniadakis, Kenji Kawaguchi

Specifically, for general multi-layer PINNs and XPINNs, we first provide a prior generalization bound via the complexity of the target functions in the PDE problem, and a posterior generalization bound via the posterior matrix norms of the networks after optimization.

Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions

2 code implementations • 20 May 2021 • Ameya D. Jagtap, Yeonjong Shin, Kenji Kawaguchi, George Em Karniadakis

We propose a new type of neural networks, Kronecker neural networks (KNNs), that form a general framework for neural networks with adaptive activation functions.
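To make "adaptive activation function" concrete: the general idea is an activation with trainable parameters, such as a mixture of fixed activations with learnable weights and frequencies. The sketch below is a generic illustration of that idea, not the paper's Kronecker construction; the particular activations, weights, and frequencies are assumptions, and in practice they would be optimized jointly with the network weights.

```python
import numpy as np

def adaptive_activation(x, alphas, omegas):
    """Illustrative adaptive activation:
        phi(x) = sum_k alphas[k] * sigma_k(omegas[k] * x),
    mixing fixed activations sigma_k with trainable mixture weights
    (alphas) and frequencies/slopes (omegas). Here they are plain
    inputs rather than optimized parameters."""
    sigmas = [np.tanh, lambda z: np.maximum(z, 0.0), np.sin]
    return sum(a * s(w * x) for a, s, w in zip(alphas, sigmas, omegas))

x = np.linspace(-2.0, 2.0, 5)
y = adaptive_activation(x, alphas=[0.5, 0.3, 0.2], omegas=[1.0, 1.0, 2.0])
print(y.shape)  # (5,)
```

Fixing particular parameter values recovers a standard fixed activation, which mirrors the sense in which KNNs generalize ordinary feed-forward networks.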
