
no code implementations • 1 Dec 2023 • Youngeun Kim, Adar Kahana, Ruokai Yin, Yuhang Li, Panos Stinis, George Em Karniadakis, Priyadarshini Panda

In this work, we delve into the role of skip connections, a widely used concept in Artificial Neural Networks (ANNs), within the domain of SNNs with TTFS coding.

no code implementations • 26 Nov 2023 • Zheyuan Hu, Zhouhao Yang, Yezhen Wang, George Em Karniadakis, Kenji Kawaguchi

To optimize the bias-variance trade-off, we combine the two approaches in a hybrid method that balances the rapid convergence of the biased version with the high accuracy of the unbiased version.

no code implementations • 23 Nov 2023 • Hanxun Jin, Enrui Zhang, Boyu Zhang, Sridhar Krishnaswamy, George Em Karniadakis, Horacio D. Espinosa

Our work marks a significant advancement in the field of materials-by-design, potentially heralding a new era in the discovery and development of next-generation metamaterials with unparalleled mechanical characteristics derived directly from experimental insights.

no code implementations • 19 Nov 2023 • Zongren Zou, Xuhui Meng, George Em Karniadakis

As a result, UQ for noisy inputs becomes a crucial factor for reliable and trustworthy deployment of these models in applications involving physical knowledge.

no code implementations • 13 Nov 2023 • Paula Chen, Tingwei Meng, Zongren Zou, Jérôme Darbon, George Em Karniadakis

This connection allows us to reinterpret incremental updates to learned models as the evolution of an associated HJ PDE and optimal control problem in time, where all of the previous information is intrinsically encoded in the solution to the HJ PDE.

no code implementations • 30 Oct 2023 • Bin Lin, Zhiping Mao, Zhicheng Wang, George Em Karniadakis

Initially, we utilize DeepONet to learn the solution operator for a set of smooth problems relevant to the PDEs characterized by sharp solutions.

no code implementations • 16 Oct 2023 • Zongren Zou, Xuhui Meng, George Em Karniadakis

Despite the effectiveness of PINNs for discovering governing equations, the physical models encoded in PINNs may be misspecified in complex systems, as some of the physical processes may not be fully understood, leading to poor accuracy in PINN predictions.

no code implementations • 4 Oct 2023 • Felipe de Castro Teixeira Carvalho, Kamaljyoti Nath, Alberto Luiz Serpa, George Em Karniadakis

In this paper, we formulate a machine learning model based on Physics-Informed Neural Networks (PINNs) to estimate crucial system parameters.

1 code implementation • 3 Oct 2023 • Katarzyna Michałowska, Somdatta Goswami, George Em Karniadakis, Signe Riemer-Sørensen

Deep operator networks (DeepONets, DONs) offer a distinct advantage over traditional neural networks in their ability to be trained on multi-resolution data.
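The generic branch–trunk evaluation underlying DeepONets can be sketched in a few lines. The sketch below uses random placeholder weights (standing in for trained sub-networks); the sensor count and layer sizes are illustrative assumptions, not the configuration from this paper. It computes G(u)(y) ≈ Σ_k b_k(u) t_k(y).

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(dims):
    """Random MLP weights: a placeholder for a trained sub-network."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(dims[:-1], dims[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

p = 16                      # number of basis terms shared by both nets
branch = mlp([100, 64, p])  # encodes the input function u sampled at 100 sensors
trunk  = mlp([1, 64, p])    # encodes the query coordinate y

def deeponet(u_sensors, y):
    """G(u)(y) ~= sum_k branch_k(u) * trunk_k(y)."""
    b = forward(branch, u_sensors)        # (p,)
    t = forward(trunk, np.atleast_2d(y))  # (n_query, p)
    return t @ b                          # (n_query,)

u = np.sin(np.linspace(0, np.pi, 100))  # an example input function
ys = np.linspace(0, 1, 5)[:, None]      # query points
print(deeponet(u, ys).shape)            # (5,)
```

Because the trunk takes only query coordinates, the same trained model can be evaluated at any resolution of output points, which is the property the multi-resolution training above exploits.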

no code implementations • 29 Sep 2023 • Nazanin Ahmadi Daryakenari, Mario De Florio, Khemraj Shukla, George Em Karniadakis

The proposed framework -- named AI-Aristotle -- combines eXtreme Theory of Functional Connections (X-TFC) domain-decomposition and Physics-Informed Neural Networks (PINNs) with symbolic regression (SR) techniques for parameter discovery and gray-box identification.

no code implementations • 31 Aug 2023 • Qian Zhang, Chenxi Wu, Adar Kahana, Youngeun Kim, Yuhang Li, George Em Karniadakis, Priyadarshini Panda

We introduce a method to convert Physics-Informed Neural Networks (PINNs), commonly used in scientific machine learning, to Spiking Neural Networks (SNNs), which are expected to have higher energy efficiency compared to traditional Artificial Neural Networks (ANNs).

no code implementations • 9 Aug 2023 • Nikolas Borrel-Jensen, Somdatta Goswami, Allan P. Engsig-Karup, George Em Karniadakis, Cheol-Ho Jeong

We address the challenge of sound propagation simulations in 3D virtual rooms with moving sources, which have applications in virtual/augmented reality, game audio, and spatial computing.

no code implementations • 23 Jul 2023 • Zheyuan Hu, Khemraj Shukla, George Em Karniadakis, Kenji Kawaguchi

We demonstrate in diverse tests that the proposed method can solve many notoriously hard high-dimensional PDEs, including the Hamilton-Jacobi-Bellman (HJB) and Schrödinger equations in tens of thousands of dimensions, very fast on a single GPU using the mesh-free PINNs approach.

no code implementations • 18 Jul 2023 • Elham Kiyani, Mahdi Kooshkbaghi, Khemraj Shukla, Rahul Babu Koneru, Zhen Li, Luis Bravo, Anindya Ghoshal, George Em Karniadakis, Mikko Karttunen

Subsequently, the closed-form dependency of the parameter values found by the PINN on the initial radii and contact angles is given using symbolic regression.

no code implementations • 18 Jul 2023 • Oded Ovadia, Eli Turkel, Adar Kahana, George Em Karniadakis

We also present a method to improve the performance of DiTTO by using fast sampling concepts from diffusion models.

no code implementations • 16 Jul 2023 • Zhen Zhang, Zongren Zou, Ellen Kuhl, George Em Karniadakis

Specifically, we integrate physics informed neural networks (PINNs) and symbolic regression to discover a reaction-diffusion type partial differential equation for tau protein misfolding and spreading.

no code implementations • 5 Jul 2023 • Alan John Varghese, Aniruddha Bora, Mengjia Xu, George Em Karniadakis

Hence, incorporating long-range dependencies from the historical graph context plays a crucial role in accurately learning their temporal dynamics.

1 code implementation • 1 Jul 2023 • Sokratis J. Anagnostopoulos, Juan Diego Toscano, Nikolaos Stergiopulos, George Em Karniadakis

Driven by the need for more efficient and seamless integration of physical models and data, physics-informed neural networks (PINNs) have seen a surge of interest in recent years.

no code implementations • 30 Jun 2023 • Alena Kopaničáková, Hardik Kothari, George Em Karniadakis, Rolf Krause

We propose to enhance the training of physics-informed neural networks (PINNs).
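For context across the PINN entries on this page: the baseline objective that such training strategies enhance is a composite loss combining a data misfit with the mean-squared PDE residual at collocation points. In generic notation (common usage, not necessarily any one paper's symbols), with network u_θ, PDE operator N[·], and weighting factor λ:

```latex
\mathcal{L}(\theta)
  = \frac{1}{N_d}\sum_{i=1}^{N_d}\bigl|u_\theta(x_i) - u_i\bigr|^2
  + \lambda\,\frac{1}{N_r}\sum_{j=1}^{N_r}\bigl|\mathcal{N}[u_\theta](x_j)\bigr|^2
```

The first sum fits the N_d observations; the second penalizes the PDE residual at N_r collocation points, which is what makes the network "physics-informed."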

no code implementations • 27 Jun 2023 • Varun Kumar, Leonard Gleyzer, Adar Kahana, Khemraj Shukla, George Em Karniadakis

To demonstrate the MyCrunchGPT workflow, and to create an infrastructure that can facilitate a broader vision, we built a web-app-based guided user interface that includes options for a comprehensive summary report.

no code implementations • 18 May 2023 • Elham Kiyani, Khemraj Shukla, George Em Karniadakis, Mikko Karttunen

In addition, symbolic regression is employed to determine the closed form of the unknown part of the equation from the data, and the results confirm the accuracy of the X-PINN-based approach.

no code implementations • 4 May 2023 • Minglang Yin, Zongren Zou, Enrui Zhang, Cristina Cavinato, Jay D. Humphrey, George Em Karniadakis

Quantifying biomechanical properties of the human vasculature could deepen our understanding of cardiovascular diseases.

no code implementations • 26 Apr 2023 • Kamaljyoti Nath, Xuhui Meng, Daniel J Smith, George Em Karniadakis

In other words, the mean value model uses both the PINN model and the DNNs to represent the engine's states, with the PINN providing a physics-based understanding of the engine's overall dynamics and the DNNs offering a more engine-specific and adaptive representation of the empirical formulae.

1 code implementation • 15 Apr 2023 • Katiana Kontolati, Somdatta Goswami, George Em Karniadakis, Michael D. Shields

Operator regression provides a powerful means of constructing discretization-invariant emulators for partial differential equations (PDEs) describing physical systems.

no code implementations • 2 Apr 2023 • Varun Kumar, Somdatta Goswami, Daniel J. Smith, George Em Karniadakis

As an alternative to physics based models, we develop an operator-based regression model (DeepONet) to learn the relevant output states for a mean-value gas flow engine model using the engine operating conditions as input variables.

no code implementations • 22 Mar 2023 • Paula Chen, Tingwei Meng, Zongren Zou, Jérôme Darbon, George Em Karniadakis

Hamilton-Jacobi partial differential equations (HJ PDEs) have deep connections with a wide range of fields, including optimal control, differential games, and imaging sciences.

no code implementations • 19 Mar 2023 • Qianying Cao, Somdatta Goswami, George Em Karniadakis

Herein, we demonstrate the superior approximation accuracy of a single Laplace layer in LNO over four Fourier modules in FNO in approximating the solutions of three ODEs (Duffing oscillator, driven gravity pendulum, and Lorenz system) and three PDEs (Euler-Bernoulli beam, diffusion equation, and reaction-diffusion system).

no code implementations • 15 Mar 2023 • Oded Ovadia, Adar Kahana, Panos Stinis, Eli Turkel, George Em Karniadakis

We combine vision transformers with operator learning to solve diverse inverse problems described by partial differential equations (PDEs).

no code implementations • 3 Mar 2023 • Katarzyna Michałowska, Somdatta Goswami, George Em Karniadakis, Signe Riemer-Sørensen

Deep neural networks are an attractive alternative for simulating complex dynamical systems, as in comparison to traditional scientific computing methods, they offer reduced computational costs during inference and can be trained directly from observational data.

1 code implementation • 28 Feb 2023 • Michael Penwarden, Ameya D. Jagtap, Shandian Zhe, George Em Karniadakis, Robert M. Kirby

This problem also arises with domain decomposition strategies, such as temporal decomposition using XPINNs, where it is in some sense more difficult.

1 code implementation • 23 Feb 2023 • Somdatta Goswami, Ameya D. Jagtap, Hessam Babaee, Bryan T. Susi, George Em Karniadakis

Specifically, to train the DeepONet for the syngas model, we solve the skeletal kinetic model for different initial conditions.

no code implementations • 7 Feb 2023 • Aniruddha Bora, Khemraj Shukla, Shixuan Zhang, Bryce Harrop, Ruby Leung, George Em Karniadakis

In this study, we replace the bias correction process with a surrogate model based on the Deep Operator Network (DeepONet).

no code implementations • 2 Feb 2023 • Khemraj Shukla, Vivek Oommen, Ahmad Peyvan, Michael Penwarden, Luis Bravo, Anindya Ghoshal, Robert M. Kirby, George Em Karniadakis

Deep neural operators, such as DeepONets, have changed the paradigm in high-dimensional nonlinear regression from function regression to (differential) operator regression, paving the way for significant changes in computational engineering applications.

no code implementations • 26 Jan 2023 • Qizhi He, Mauro Perego, Amanda A. Howard, George Em Karniadakis, Panos Stinis

One of the most challenging and consequential problems in climate modeling is to provide probabilistic projections of sea level rise.

no code implementations • 5 Jan 2023 • Zongren Zou, George Em Karniadakis

We introduce multi-head neural networks (MH-NNs) to physics-informed machine learning; MH-NNs are neural networks (NNs) in which the nonlinear hidden layers form a shared body and multiple linear output layers serve as heads.
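The body-plus-heads layout described above can be sketched directly; the layer sizes, tanh activations, and head count below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Shared nonlinear body: input -> hidden features.
W1, b1 = rng.standard_normal((2, 32)) * 0.1, np.zeros(32)
W2, b2 = rng.standard_normal((32, 32)) * 0.1, np.zeros(32)

# Multiple linear output layers ("heads"), one per task.
n_heads = 3
heads = [rng.standard_normal((32, 1)) * 0.1 for _ in range(n_heads)]

def body(x):
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2)

def mh_nn(x):
    """Return one prediction per head from the shared body features."""
    h = body(x)
    return [h @ Wh for Wh in heads]

x = rng.standard_normal((5, 2))   # a batch of 5 inputs
outs = mh_nn(x)
print(len(outs), outs[0].shape)   # 3 (5, 1)
```

Each head is purely linear, so all nonlinearity (and most of the parameters) lives in the body shared across tasks.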

1 code implementation • 13 Dec 2022 • Min Zhu, Handi Zhang, Anran Jiao, George Em Karniadakis, Lu Lu

Deep neural operators can learn nonlinear mappings between infinite-dimensional function spaces via deep neural networks.

no code implementations • 17 Nov 2022 • Qian Zhang, Adar Kahana, George Em Karniadakis, Panos Stinis

We propose a Spiking Neural Network (SNN)-based explicit numerical scheme for long time integration of time-dependent Ordinary and Partial Differential Equations (ODEs, PDEs).

1 code implementation • 16 Nov 2022 • Zheyuan Hu, Ameya D. Jagtap, George Em Karniadakis, Kenji Kawaguchi

We also show cases where XPINN is already better than PINN, so APINN can still slightly improve XPINN.

no code implementations • 6 Sep 2022 • Ameya D. Jagtap, George Em Karniadakis

For this purpose, we also discuss various requirements for activation functions that have been used in the physics-informed machine learning framework.

no code implementations • 28 Aug 2022 • Enrui Zhang, Adar Kahana, Eli Turkel, Rishikesh Ranade, Jay Pathak, George Em Karniadakis

Based on recent advances in scientific deep learning for operator regression, we propose HINTS, a hybrid, iterative, numerical, and transferable solver for differential equations.

no code implementations • 25 Aug 2022 • Zongren Zou, Xuhui Meng, Apostolos F Psaros, George Em Karniadakis

In this paper, we present an open-source Python library (https://github.com/Crunch-UQ4MI), termed NeuralUQ and accompanied by an educational tutorial, for employing UQ methods for SciML in a convenient and structured manner.

no code implementations • 21 Aug 2022 • Enrui Zhang, Bart Spronck, Jay D. Humphrey, George Em Karniadakis

Many genetic mutations adversely affect the structure and function of load-bearing soft tissues, with clinical sequelae often responsible for disability or death.

no code implementations • 8 Jul 2022 • Somdatta Goswami, Aniruddha Bora, Yue Yu, George Em Karniadakis

Standard neural networks can approximate general nonlinear operators, represented either explicitly by a combination of mathematical operators, e.g., in an advection-diffusion-reaction partial differential equation, or simply as a black box, e.g., a system-of-systems.

no code implementations • 17 May 2022 • Adar Kahana, Qian Zhang, Leonard Gleyzer, George Em Karniadakis

We demonstrate this new approach for classification using the SNN in the branch, achieving results comparable to those reported in the literature.

no code implementations • 16 May 2022 • Khemraj Shukla, Mengjia Xu, Nathaniel Trask, George Em Karniadakis

For more complex systems or systems of systems and unstructured data, graph neural networks (GNNs) present some distinct advantages, and here we review how physics-informed learning can be accomplished with GNNs based on graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs).


no code implementations • 12 May 2022 • Kevin Linka, Amelie Schafer, Xuhui Meng, Zongren Zou, George Em Karniadakis, Ellen Kuhl

Our study reveals the inherent advantages and disadvantages of Neural Networks, Bayesian Inference, and a combination of both and provides valuable guidelines for model selection.

no code implementations • 8 May 2022 • Somdatta Goswami, David S. Li, Bruno V. Rego, Marcos Latorre, Jay D. Humphrey, George Em Karniadakis

Thoracic aortic aneurysm (TAA) is a localized dilatation of the aorta resulting from compromised wall composition, structure, and function, which can lead to life-threatening dissection or rupture.

1 code implementation • 20 Apr 2022 • Somdatta Goswami, Katiana Kontolati, Michael D. Shields, George Em Karniadakis

Transfer learning (TL) enables the transfer of knowledge gained in learning to perform one task (source) to a related but different task (target), hence addressing the expense of data acquisition and labeling, potential computational power limitations, and dataset distribution mismatches.
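In its most minimal form, transfer learning as defined above can be sketched as reusing a pretrained feature body on the target task and refitting only a lightweight output layer. The sketch below is a generic illustration under that assumption (random features stand in for source-task weights); it is not this paper's exact transfer scheme.

```python
import numpy as np

rng = np.random.default_rng(3)

# "Pretrained" body from the source task: frozen nonlinear features.
# (Random weights stand in for weights learned on the source task.)
W = rng.standard_normal((1, 64))
b = rng.standard_normal(64)

def body(x):
    return np.tanh(2.0 * x @ W + b)  # kept frozen during transfer

# Target-task data: a new function to fit.
x = np.linspace(-1.0, 1.0, 50)[:, None]
y = np.sin(3.0 * x)

# Transfer step: refit only the linear output layer by least squares.
H = body(x)
head, *_ = np.linalg.lstsq(H, y, rcond=None)
y_hat = H @ head

print(np.max(np.abs(y_hat - y)))  # training misfit; should be small
```

Because only the linear head is re-estimated, the target task needs far less data and compute than training from scratch, which is the expense the TL framing addresses.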

no code implementations • 11 Apr 2022 • Vivek Oommen, Khemraj Shukla, Somdatta Goswami, Remi Dingreville, George Em Karniadakis

We utilize the convolutional autoencoder to provide a compact representation of the microstructure data in a low-dimensional latent space.

no code implementations • 5 Apr 2022 • Ethan Pickering, Stephen Guth, George Em Karniadakis, Themistoklis P. Sapsis

This model-agnostic framework pairs a BED scheme that actively selects data for quantifying extreme events with an ensemble of DNOs that approximate infinite-dimensional nonlinear operators.

1 code implementation • 9 Mar 2022 • Katiana Kontolati, Somdatta Goswami, Michael D. Shields, George Em Karniadakis

In contrast, even a highly over-parameterized DeepONet leads to better generalization for both smooth and non-smooth dynamics.

no code implementations • 25 Feb 2022 • Minglang Yin, Enrui Zhang, Yue Yu, George Em Karniadakis

In this work, we explore the idea of multiscale modeling with machine learning and employ DeepONet, a neural operator, as an efficient surrogate of the expensive solver.

no code implementations • 23 Feb 2022 • Ameya D. Jagtap, Zhiping Mao, Nikolaus Adams, George Em Karniadakis

Accurate solutions to inverse supersonic compressible flow problems are often required for designing specialized aerospace vehicles.

2 code implementations • 3 Feb 2022 • Mitchell Daneker, Zhen Zhang, George Em Karniadakis, Lu Lu

The dynamics of systems biological processes are usually modeled by a system of ordinary differential equations (ODEs) with many unknown parameters that need to be inferred from noisy and sparse measurements.
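As a toy analogue of the inference problem described above (not the paper's systems-biology PINN method), a single unknown rate constant of du/dt = -k·u can be recovered from noisy, sparse samples by minimizing the squared data misfit over forward simulations:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(k, t):
    """Closed-form solution of du/dt = -k*u with u(0) = 1."""
    return np.exp(-k * t)

# Sparse, noisy measurements generated with a "true" rate k = 0.7.
t_obs = np.array([0.0, 0.5, 1.5, 3.0, 5.0])
u_obs = simulate(0.7, t_obs) + 0.01 * rng.standard_normal(t_obs.size)

# Grid search over candidate rates, minimizing the squared data misfit.
ks = np.linspace(0.1, 2.0, 400)
losses = [np.sum((simulate(k, t_obs) - u_obs) ** 2) for k in ks]
k_hat = ks[int(np.argmin(losses))]

print(k_hat)  # recovered rate; should be near the true value 0.7
```

Real systems-biology models replace the closed-form solution with a numerical ODE solve and the grid search with gradient-based optimization over many parameters, but the misfit-minimization structure is the same.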

1 code implementation • 19 Jan 2022 • Apostolos F Psaros, Xuhui Meng, Zongren Zou, Ling Guo, George Em Karniadakis

Neural networks (NNs) are currently changing the computational paradigm on how to combine data with mathematical laws in physics and engineering in a profound way, tackling challenging inverse and ill-posed problems not solvable with traditional methods.

1 code implementation • 14 Jan 2022 • Tingwei Meng, Zhen Zhang, Jérôme Darbon, George Em Karniadakis

Solving high-dimensional optimal control problems in real-time is an important but challenging problem, with applications to multi-agent path planning problems, which have drawn increased attention given the growing popularity of drones in recent years.

2 code implementations • 1 Nov 2021 • Jeremy Yu, Lu Lu, Xuhui Meng, George Em Karniadakis

We tested gPINNs extensively and demonstrated the effectiveness of gPINNs in both forward and inverse PDE problems.
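The "gradient-enhanced" part of gPINNs refers to augmenting the standard PINN loss with derivatives of the PDE residual at the collocation points. In generic notation (an assumption about symbols, not necessarily this paper's), with L(θ) the standard PINN loss (data misfit plus mean-squared PDE residual), r the residual, and λ_i per-dimension weights:

```latex
\mathcal{L}_{g}(\theta)
  = \mathcal{L}(\theta)
  + \sum_{i=1}^{d} \lambda_i \,\frac{1}{N_r}\sum_{j=1}^{N_r}
    \left|\frac{\partial r}{\partial x_i}(x_j;\theta)\right|^2,
\qquad r(x;\theta) = \mathcal{N}[u_\theta](x)
```

Penalizing the residual's gradient, not just its value, encourages the residual to be flat near collocation points rather than merely small at them.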

1 code implementation • 28 Sep 2021 • Mengjia Xu, Apoorva Vikram Singh, George Em Karniadakis

However, recent advances mostly focus on learning node embeddings as deterministic "vectors" for static graphs, disregarding the key temporal dynamics of the graph and the evolving uncertainties associated with node embeddings in the latent space.

no code implementations • 20 Sep 2021 • Zheyuan Hu, Ameya D. Jagtap, George Em Karniadakis, Kenji Kawaguchi

Specifically, for general multi-layer PINNs and XPINNs, we first provide a prior generalization bound via the complexity of the target functions in the PDE problem, and a posterior generalization bound via the posterior matrix norms of the networks after optimization.

no code implementations • 31 Aug 2021 • Zhen Zhang, Yeonjong Shin, George Em Karniadakis

We propose the GENERIC formalism informed neural networks (GFINNs) that obey the symmetric degeneracy conditions of the GENERIC formalism.

no code implementations • 25 Aug 2021 • Minglang Yin, Ehsan Ban, Bruno V. Rego, Enrui Zhang, Cristina Cavinato, Jay D. Humphrey, George Em Karniadakis

Aortic dissection progresses via delamination of the medial layer of the wall.

no code implementations • 12 Jul 2021 • Apostolos F Psaros, Kenji Kawaguchi, George Em Karniadakis

In the computational examples, the meta-learned losses are employed at test time for addressing regression and PDE task distributions.

no code implementations • 8 Jun 2021 • Xuhui Meng, Liu Yang, Zhiping Mao, Jose del Aguila Ferrandis, George Em Karniadakis

In summary, the proposed method is capable of learning flexible functional priors, and can be extended to big data problems using stochastic HMC or normalizing flows since the latent space is generally characterized as low dimensional.

no code implementations • 5 Jun 2021 • Qian Zhang, Konstantina Sampani, Mengjia Xu, Shengze Cai, Yixiang Deng, He Li, Jennifer K. Sun, George Em Karniadakis

Microaneurysms (MAs) are one of the earliest signs of diabetic retinopathy (DR), a frequent complication of diabetes that can lead to visual impairment and blindness.

no code implementations • 20 May 2021 • Shengze Cai, Zhiping Mao, Zhicheng Wang, Minglang Yin, George Em Karniadakis

Despite the significant progress over the last 50 years in simulating flow problems using numerical discretization of the Navier-Stokes equations (NSE), we still cannot seamlessly incorporate noisy data into existing algorithms, mesh generation remains complex, and we cannot tackle high-dimensional problems governed by parametrized NSE.

2 code implementations • 20 May 2021 • Ameya D. Jagtap, Yeonjong Shin, Kenji Kawaguchi, George Em Karniadakis

We propose a new type of neural networks, Kronecker neural networks (KNNs), that form a general framework for neural networks with adaptive activation functions.

no code implementations • 6 Apr 2021 • Yeonjong Shin, Jérôme Darbon, George Em Karniadakis

We propose three versions -- non-adaptive, adaptive terminal, and adaptive order.

no code implementations • 17 Jan 2021 • Liu Yang, Tingwei Meng, George Em Karniadakis

We propose a simple but effective modification of the discriminators, namely measure-conditional discriminators, as a plug-and-play module for different GANs.

no code implementations • 23 Dec 2020 • Chensen Lin, Zhen Li, Lu Lu, Shengze Cai, Martin Maxey, George Em Karniadakis

Simulating and predicting multiscale problems that couple multiple physics and dynamics across many orders of spatiotemporal scales is a great challenge that has not been investigated systematically by deep neural networks (DNNs).


no code implementations • 19 Dec 2020 • Xuhui Meng, Hessam Babaee, George Em Karniadakis

We propose a new class of Bayesian neural networks (BNNs) that can be trained using noisy data of variable fidelity, and we apply them to learn function approximations as well as to solve inverse problems based on partial differential equations (PDEs).

1 code implementation • 5 Dec 2020 • Pengzhan Jin, Zhen Zhang, Ioannis G. Kevrekidis, George Em Karniadakis

We propose the Poisson neural networks (PNNs) to learn Poisson systems and trajectories of autonomous systems from data.

Papers With Code is a free resource with all data licensed under CC-BY-SA.