no code implementations • 4 Mar 2024 • Shuvayan Brahmachary, Subodh M. Joshi, Aniruddha Panda, Kaushik Koneripalli, Arun Kumar Sagotra, Harshil Patel, Ankush Sharma, Ameya D. Jagtap, Kaushic Kalyanaraman
Large Language Models (LLMs) have demonstrated remarkable reasoning abilities, prompting interest in their application as black-box optimizers.
1 code implementation • 16 Jan 2024 • Ahmad Peyvan, Vivek Oommen, Ameya D. Jagtap, George Em Karniadakis
Developing the proper representations for simulating high-speed flows with strong shock waves, rarefactions, and contact discontinuities has been a long-standing question in numerical analysis.
no code implementations • 18 Sep 2023 • Tatiana Kossaczká, Ameya D. Jagtap, Matthias Ehrhardt
In this paper, we introduce an improved version of the fifth-order weighted essentially non-oscillatory (WENO) shock-capturing scheme by incorporating deep learning techniques.
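The entry above builds on the classical fifth-order WENO scheme; a minimal sketch of the baseline (non-learned) Jiang-Shu WENO5 reconstruction from cell averages is below. The deep-learning modification proposed in the paper is not reproduced here.

```python
# Classical fifth-order WENO (Jiang-Shu) reconstruction at the right cell
# face x_{i+1/2} from five cell averages v[i-2..i+2]. This is the baseline
# scheme that the paper improves with deep learning (not reproduced here).

def weno5(vm2, vm1, v0, vp1, vp2, eps=1e-6):
    # Candidate third-order reconstructions on the three sub-stencils.
    p0 = (2.0 * vm2 - 7.0 * vm1 + 11.0 * v0) / 6.0
    p1 = (-vm1 + 5.0 * v0 + 2.0 * vp1) / 6.0
    p2 = (2.0 * v0 + 5.0 * vp1 - vp2) / 6.0

    # Jiang-Shu smoothness indicators for each sub-stencil.
    b0 = 13.0/12.0 * (vm2 - 2*vm1 + v0)**2 + 0.25 * (vm2 - 4*vm1 + 3*v0)**2
    b1 = 13.0/12.0 * (vm1 - 2*v0 + vp1)**2 + 0.25 * (vm1 - vp1)**2
    b2 = 13.0/12.0 * (v0 - 2*vp1 + vp2)**2 + 0.25 * (3*v0 - 4*vp1 + vp2)**2

    # Nonlinear weights, built from the linear (optimal) weights 1/10, 6/10, 3/10.
    a0 = 0.1 / (eps + b0)**2
    a1 = 0.6 / (eps + b1)**2
    a2 = 0.3 / (eps + b2)**2
    s = a0 + a1 + a2
    return (a0 * p0 + a1 * p1 + a2 * p2) / s
```

On smooth data the nonlinear weights approach the linear ones, recovering fifth-order accuracy; near a discontinuity the weight of the oscillatory sub-stencil is driven toward zero, which suppresses spurious oscillations.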
1 code implementation • 28 Feb 2023 • Michael Penwarden, Ameya D. Jagtap, Shandian Zhe, George Em Karniadakis, Robert M. Kirby
This problem also arises, and is in some sense more difficult, in domain decomposition strategies such as temporal decomposition using XPINNs.
1 code implementation • 23 Feb 2023 • Somdatta Goswami, Ameya D. Jagtap, Hessam Babaee, Bryan T. Susi, George Em Karniadakis
Specifically, to train the DeepONet for the syngas model, we solve the skeletal kinetic model for different initial conditions.
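As a rough illustration of the data-generation step described above — solving a kinetic model for many initial conditions to build operator-learning training pairs — here is a toy sketch using a single first-order reaction y' = -k*y in place of the skeletal syngas mechanism. The reaction model, rate constant, and grid sizes are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Toy stand-in for the skeletal kinetic model: y' = -k * y.
# DeepONet training pairs map an initial condition y0 (branch input)
# and a query time t (trunk input) to the solution y(t; y0).

def solve_kinetics(y0, k=2.0, t_end=1.0, n_steps=100):
    """Explicit-Euler integration of y' = -k*y from initial condition y0."""
    t = np.linspace(0.0, t_end, n_steps + 1)
    y = np.empty_like(t)
    y[0] = y0
    dt = t[1] - t[0]
    for n in range(n_steps):
        y[n + 1] = y[n] + dt * (-k * y[n])
    return t, y

# Sweep initial conditions to build the dataset.
initial_conditions = np.linspace(0.5, 2.0, 16)
branch_in, trunk_in, targets = [], [], []
for y0 in initial_conditions:
    t, y = solve_kinetics(y0)
    branch_in.append(np.full_like(t, y0))  # IC repeated per query time
    trunk_in.append(t)                     # query times
    targets.append(y)                      # solution values

branch_in = np.concatenate(branch_in)
trunk_in = np.concatenate(trunk_in)
targets = np.concatenate(targets)
```

The resulting arrays pair each (initial condition, query time) with the corresponding solution value, which is the shape of supervision a DeepONet's branch and trunk networks consume.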
1 code implementation • 16 Nov 2022 • Zheyuan Hu, Ameya D. Jagtap, George Em Karniadakis, Kenji Kawaguchi
We also show cases where XPINN is already better than PINN, and where APINN can still slightly improve on XPINN.
no code implementations • 6 Sep 2022 • Ameya D. Jagtap, George Em Karniadakis
To this end, we also discuss various requirements on the activation functions used in the physics-informed machine learning framework.
no code implementations • 17 Mar 2022 • Tim De Ryck, Ameya D. Jagtap, Siddhartha Mishra
We prove rigorous bounds on the errors resulting from the approximation of the incompressible Navier-Stokes equations with (extended) physics-informed neural networks.
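For context, the residuals that such error bounds control are the standard PINN residuals of the incompressible Navier-Stokes equations; in the usual form (generic notation, not necessarily the paper's),

```latex
% PINN residuals for the incompressible Navier-Stokes equations,
% with network outputs (\mathbf{u}_\theta, p_\theta):
\mathbf{r}_{\mathrm{mom}}
  = \partial_t \mathbf{u}_\theta
  + (\mathbf{u}_\theta \cdot \nabla)\,\mathbf{u}_\theta
  + \nabla p_\theta
  - \nu \,\Delta \mathbf{u}_\theta ,
\qquad
r_{\mathrm{div}} = \nabla \cdot \mathbf{u}_\theta ,
```

penalized at collocation points together with boundary- and initial-condition mismatch terms.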
no code implementations • 23 Feb 2022 • Ameya D. Jagtap, Zhiping Mao, Nikolaus Adams, George Em Karniadakis
Accurate solutions to inverse supersonic compressible flow problems are often required for designing specialized aerospace vehicles.
no code implementations • 20 Sep 2021 • Zheyuan Hu, Ameya D. Jagtap, George Em Karniadakis, Kenji Kawaguchi
Specifically, for general multi-layer PINNs and XPINNs, we first provide a prior generalization bound via the complexity of the target functions in the PDE problem, and a posterior generalization bound via the posterior matrix norms of the networks after optimization.
2 code implementations • 20 May 2021 • Ameya D. Jagtap, Yeonjong Shin, Kenji Kawaguchi, George Em Karniadakis
We propose a new type of neural network, the Kronecker neural network (KNN), which forms a general framework for neural networks with adaptive activation functions.
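A minimal sketch of the adaptive-activation idea that KNNs generalize — trainable slopes and mixing weights inside fixed base activations — is below. The specific two-term mixture and base functions are illustrative assumptions, not the paper's exact Kronecker construction.

```python
import math

# Illustrative adaptive activation in the spirit of adaptive-activation PINNs:
#   phi(x) = w1 * tanh(n * a1 * x) + w2 * sin(n * a2 * x),
# where n is a fixed scale factor and (w1, w2, a1, a2) are trained jointly
# with the network weights. The choice of base functions here is an
# assumption for illustration only.

def adaptive_activation(x, w1=1.0, w2=0.0, a1=1.0, a2=1.0, n=1.0):
    return w1 * math.tanh(n * a1 * x) + w2 * math.sin(n * a2 * x)
```

With w1 = 1, w2 = 0, a1 = 1, and n = 1 this reduces to a plain tanh; training the extra parameters lets each layer reshape its activation, the mechanism credited with faster convergence in this line of work.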
no code implementations • 25 Sep 2019 • Ameya D. Jagtap, Kenji Kawaguchi, George Em Karniadakis
Furthermore, the proposed methods with the slope recovery are shown to accelerate the training process.
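One plausible form of a slope-recovery regularizer is sketched below: a term that shrinks as the trainable activation slopes grow, nudging the activations away from the flat (vanishing-gradient) regime. This reciprocal-mean-exponential form is an assumption for illustration and may differ from the paper's exact term.

```python
import math

def slope_recovery(slopes):
    """Slope-recovery-style regularizer: decreases as the trainable slopes
    grow, discouraging flat activations. The reciprocal-mean-exponential
    form is an illustrative assumption, not necessarily the paper's term."""
    return 1.0 / (sum(math.exp(a) for a in slopes) / len(slopes))

# Used additively during training, e.g.
#   total_loss = data_loss + pde_loss + slope_recovery(slopes)
```

Because the term is differentiable in the slopes, ordinary gradient-based training drives them upward wherever doing so does not hurt the data and PDE losses.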