no code implementations • 15 Jan 2025 • Zhiwei Gao, George Em Karniadakis
Additionally, we incorporate the active subspace method to reduce the parameter-space dimensionality, allowing us to improve the accuracy of predictions and obtain more reliable uncertainty estimates. Extensive experiments demonstrate the efficacy of our approach in various test cases, including scenarios with large datasets and high noise levels.
no code implementations • 7 Jan 2025 • Ruyin Wan, Ehsan Kharazmi, Michael S Triantafyllou, George Em Karniadakis
We introduce DeepVIVONet, a new framework for optimal dynamic reconstruction and forecasting of the vortex-induced vibrations (VIV) of a marine riser, using field data.
no code implementations • 2 Jan 2025 • Qian Zhang, Dmitry Krotov, George Em Karniadakis
We formulate reconstruction as a mapping from incomplete observed data to full reconstructed fields.
no code implementations • 21 Dec 2024 • Juan Diego Toscano, Li-Lian Wang, George Em Karniadakis
Inspired by the Kolmogorov-Arnold representation theorem and Kurkova's principle of using approximate representations, we propose the Kurkova-Kolmogorov-Arnold Network (KKAN), a new two-block architecture that combines robust multi-layer perceptron (MLP) based inner functions with flexible linear combinations of basis functions as outer functions.
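The "linear combinations of basis functions as outer functions" idea in the KKAN entry above can be illustrated with a minimal sketch. This is a hypothetical toy layer, not the paper's implementation: each edge carries a learnable univariate function expressed in a polynomial basis, and each output sums one such function per input, in the spirit of the Kolmogorov-Arnold representation.

```python
def basis(t, degree=3):
    # simple polynomial basis; KAN-style models may use splines or other families
    return [t ** k for k in range(degree + 1)]

def ka_edge(t, coeffs):
    # one learnable univariate function: a linear combination of basis functions
    return sum(c * b for c, b in zip(coeffs, basis(t, len(coeffs) - 1)))

def ka_layer(inputs, coeff_table):
    # each output sums one univariate function per input (Kolmogorov-Arnold style)
    return [sum(ka_edge(x, coeff_table[i][j]) for j, x in enumerate(inputs))
            for i in range(len(coeff_table))]

# hypothetical coefficients: the single output computes x0^2 + 2*x1
coeffs = [[[0.0, 0.0, 1.0], [0.0, 2.0]]]
print(ka_layer([3.0, 4.0], coeffs))  # [17.0]
```

In KKAN, the inner functions are MLP blocks rather than fixed maps; the sketch only shows how the outer basis-expansion functions act.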
no code implementations • 16 Dec 2024 • Kamaljyoti Nath, Varun Kumar, Daniel J. Smith, George Em Karniadakis
The objective of this study is to develop a computationally efficient neural network-based approach for identifying unknown parameters of a mean value diesel engine model to facilitate physics-based health monitoring and maintenance forecasting.
no code implementations • 15 Dec 2024 • Elham Kiyani, Manav Manav, Nikhil Kadivar, Laura De Lorenzis, George Em Karniadakis
Phase-field modeling reformulates fracture problems as energy minimization problems and enables a comprehensive characterization of the fracture process, including crack nucleation, propagation, merging, and branching, without relying on ad-hoc assumptions.
no code implementations • 11 Nov 2024 • Ruyin Wan, Qian Zhang, George Em Karniadakis
Spiking neural networks (SNNs) represent a promising approach in machine learning, combining the hierarchical learning capabilities of deep neural networks with the energy efficiency of spike-based computations.
no code implementations • 17 Oct 2024 • Juan Diego Toscano, Vivek Oommen, Alan John Varghese, Zongren Zou, Nazanin Ahmadi Daryakenari, Chenxi Wu, George Em Karniadakis
Physics-Informed Neural Networks (PINNs) have emerged as a key tool in Scientific Machine Learning since their introduction in 2017, enabling the efficient solution of ordinary and partial differential equations using sparse measurements.
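The PINN recipe described above can be sketched in miniature: a trial solution is penalized both for mismatching sparse measurements and for violating the governing equation. This is a hypothetical toy example for the ODE u'(x) + u(x) = 0, with derivatives taken by finite differences rather than the automatic differentiation a real PINN would use.

```python
import math

def u(x, a):
    # trial solution parameterized by a single weight `a` (toy stand-in for a network)
    return math.exp(a * x)

def pinn_loss(a, data, collocation, h=1e-5, weight=1.0):
    # data-mismatch term on sparse measurements
    data_loss = sum((u(x, a) - y) ** 2 for x, y in data) / len(data)
    # physics residual of u'(x) + u(x) = 0 at collocation points (central differences)
    phys_loss = 0.0
    for x in collocation:
        du = (u(x + h, a) - u(x - h, a)) / (2 * h)
        phys_loss += (du + u(x, a)) ** 2
    phys_loss /= len(collocation)
    return data_loss + weight * phys_loss

# sparse, noise-free measurements of the true solution u(x) = exp(-x)
data = [(x, math.exp(-x)) for x in (0.0, 0.5, 1.0)]
colloc = [i / 10 for i in range(11)]

print(pinn_loss(-1.0, data, colloc))  # exact parameter: both terms nearly vanish
print(pinn_loss(-0.5, data, colloc))  # wrong parameter: clearly positive loss
```

Training a PINN amounts to minimizing this composite loss over the network parameters.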
no code implementations • 15 Sep 2024 • Tingwei Meng, Zongren Zou, Jérôme Darbon, George Em Karniadakis
The interplay between stochastic processes and optimal control has been extensively explored in the literature.
no code implementations • 13 Sep 2024 • Vivek Oommen, Aniruddha Bora, Zhen Zhang, George Em Karniadakis
We integrate neural operators with diffusion models to address the spectral limitations of neural operators in surrogate modeling of turbulent flows.
no code implementations • 5 Sep 2024 • Zheyuan Hu, Nazanin Ahmadi Daryakenari, Qianli Shen, Kenji Kawaguchi, George Em Karniadakis
We demonstrate Mamba's superior performance in both interpolation and challenging extrapolation tasks.
no code implementations • 2 Sep 2024 • Jin Song, Ming Zhong, George Em Karniadakis, Zhenya Yan
We propose a new two-stage initial-value iterative neural network (IINN) algorithm for solitary wave computations of nonlinear wave equations based on traditional numerical iterative methods and physics-informed neural networks (PINNs).
no code implementations • 29 Aug 2024 • Alan John Varghese, Zhen Zhang, George Em Karniadakis
Herein, we introduce Symplectic Graph Neural Networks (SympGNNs) that can effectively handle system identification in high-dimensional Hamiltonian systems, as well as node classification.
no code implementations • 29 Aug 2024 • Maziar Raissi, Paris Perdikaris, Nazanin Ahmadi, George Em Karniadakis
In this paper, we review Physics-Informed Neural Networks (PINNs), which have become a main pillar of scientific machine learning; we present recent practical extensions and provide a specific example of data-driven discovery of governing differential equations.
no code implementations • 13 Aug 2024 • Mario De Florio, Zongren Zou, Daniele E. Schiavazzi, George Em Karniadakis
With a specific focus on biological and physiological models, this study investigates the decomposition of total uncertainty in the estimation of states and parameters of a differential system simulated with MC X-TFC, a new physics-informed approach for uncertainty quantification based on random projections and Monte-Carlo sampling.
1 code implementation • 5 Aug 2024 • Varun Kumar, Somdatta Goswami, Katiana Kontolati, Michael D. Shields, George Em Karniadakis
Our approach is demonstrated on three benchmark problems: (1) learning different functional forms of the source term in the Fisher equation; (2) learning multiple geometries in a 2D Darcy flow problem and showcasing better transfer learning capabilities to new geometries; and (3) learning 3D parameterized geometries for a heat transfer problem and demonstrating the ability to predict on new but similar geometries.
1 code implementation • 30 Jul 2024 • Khemraj Shukla, Zongren Zou, Chi Hin Chan, Additi Pandey, Zhicheng Wang, George Em Karniadakis
This study introduces NeuroSEM, a hybrid framework integrating PINNs with the high-fidelity Spectral Element Method (SEM) solver, Nektar++.
no code implementations • 22 Jul 2024 • Juan Diego Toscano, Theo Käufer, Zhibo Wang, Martin Maxey, Christian Cierpka, George Em Karniadakis
This physics-informed machine learning method enables us to infer continuous temperature fields using only sparse velocity data, hence eliminating the need for direct temperature measurements.
no code implementations • 17 Jun 2024 • Zheyuan Hu, Zhongqiang Zhang, George Em Karniadakis, Kenji Kawaguchi
We introduce an innovative approach for solving high-dimensional Fokker-Planck-Lévy (FPL) equations in modeling non-Brownian processes across disciplines such as physics, finance, and ecology.
1 code implementation • 17 Jun 2024 • Zheyuan Hu, Kenji Kawaguchi, Zhongqiang Zhang, George Em Karniadakis
We validate our methods on various forward and inverse problems of fractional and tempered fractional PDEs, scaling up to 100,000 dimensions.
no code implementations • 16 Jun 2024 • Youngkyu Lee, Alena Kopaničáková, George Em Karniadakis
We introduce a novel two-level overlapping additive Schwarz preconditioner for accelerating the training of scientific machine learning applications.
no code implementations • 5 Jun 2024 • Khemraj Shukla, Juan Diego Toscano, Zhicheng Wang, Zongren Zou, George Em Karniadakis
Kolmogorov-Arnold Networks (KANs) were recently introduced as an alternative representation model to MLPs.
no code implementations • 29 May 2024 • Benjamin Shih, Ahmad Peyvan, Zhongqiang Zhang, George Em Karniadakis
Transformers have not been used in that capacity, and specifically, they have not been tested for solutions of PDEs with low regularity.
no code implementations • 20 May 2024 • Zongren Zou, Adar Kahana, Enrui Zhang, Eli Turkel, Rishikesh Ranade, Jay Pathak, George Em Karniadakis
We extend a recently proposed machine-learning-based iterative solver, i.e., the hybrid iterative transferable solver (HINTS), to solve the scattering problem described by the Helmholtz equation in an exterior domain with a complex absorbing boundary condition.
no code implementations • 30 Apr 2024 • Shupeng Wang, George Em Karniadakis
Our results demonstrate the effectiveness of GMC-PINNs in dealing with irregular domain problems and show a higher computational efficiency compared to the original fPINN method.
1 code implementation • 12 Apr 2024 • Zongren Zou, Tingwei Meng, Paula Chen, Jérôme Darbon, George Em Karniadakis
We provide several examples from SciML involving noisy data and epistemic uncertainty to illustrate the potential advantages of our approach.
no code implementations • 27 Mar 2024 • Sokratis J. Anagnostopoulos, Juan Diego Toscano, Nikolaos Stergiopulos, George Em Karniadakis
We investigate the learning dynamics of fully-connected neural networks through the lens of gradient signal-to-noise ratio (SNR), examining the behavior of first-order optimizers like Adam in non-convex objectives.
no code implementations • 27 Feb 2024 • Qiao Zhuang, Chris Ziyi Yao, Zhongqiang Zhang, George Em Karniadakis
We propose a two-scale neural network method for solving partial differential equations (PDEs) with small parameters using physics-informed neural networks (PINNs).
no code implementations • 12 Feb 2024 • Zheyuan Hu, Zhongqiang Zhang, George Em Karniadakis, Kenji Kawaguchi
The score function, defined as the gradient of the LL, plays a fundamental role in inferring LL and PDF and enables fast SDE sampling.
1 code implementation • 16 Jan 2024 • Ahmad Peyvan, Vivek Oommen, Ameya D. Jagtap, George Em Karniadakis
Developing the proper representations for simulating high-speed flows with strong shock waves, rarefactions, and contact discontinuities has been a long-standing question in numerical analysis.
1 code implementation • 22 Dec 2023 • Zheyuan Hu, Zekun Shi, George Em Karniadakis, Kenji Kawaguchi
We further showcase HTE's convergence to the original PINN loss and its unbiased behavior under specific conditions.
1 code implementation • 21 Dec 2023 • Mario De Florio, Ioannis G. Kevrekidis, George Em Karniadakis
The performance of this framework is validated by recovering the right-hand sides and unknown terms of certain complex, chaotic systems such as the well-known Lorenz system, a six-dimensional hyperchaotic system, and the non-autonomous Sprott chaotic system, and comparing them with their known analytical expressions.
1 code implementation • 8 Dec 2023 • Vivek Oommen, Khemraj Shukla, Saaketh Desai, Remi Dingreville, George Em Karniadakis
This methodology is based on the integration of a community numerical solver with a U-Net neural operator, enhanced by a temporal-conditioning mechanism that enables accurate extrapolation and efficient time-to-solution predictions of the dynamics.
no code implementations • 5 Dec 2023 • Chenxi Wu, Alan John Varghese, Vivek Oommen, George Em Karniadakis
Herein, we consider 13 GPT-related papers across different scientific domains, reviewed by a human reviewer and SciSpace, a large language model, with the reviews evaluated by three distinct types of evaluators, namely GPT-3.5, a crowd panel, and GPT-4.
no code implementations • 1 Dec 2023 • Youngeun Kim, Adar Kahana, Ruokai Yin, Yuhang Li, Panos Stinis, George Em Karniadakis, Priyadarshini Panda
In this work, we delve into the role of skip connections, a widely used concept in Artificial Neural Networks (ANNs), within the domain of SNNs with TTFS coding.
no code implementations • 26 Nov 2023 • Zheyuan Hu, Zhouhao Yang, Yezhen Wang, George Em Karniadakis, Kenji Kawaguchi
To optimize the bias-variance trade-off, we combine the two approaches in a hybrid method that balances the rapid convergence of the biased version with the high accuracy of the unbiased version.
no code implementations • 23 Nov 2023 • Hanxun Jin, Enrui Zhang, Boyu Zhang, Sridhar Krishnaswamy, George Em Karniadakis, Horacio D. Espinosa
Our work marks a significant advancement in the field of materials-by-design, potentially heralding a new era in the discovery and development of next-generation metamaterials with unparalleled mechanical characteristics derived directly from experimental insights.
no code implementations • 19 Nov 2023 • Zongren Zou, Xuhui Meng, George Em Karniadakis
As a result, UQ for noisy inputs becomes a crucial factor for reliable and trustworthy deployment of these models in applications involving physical knowledge.
no code implementations • 13 Nov 2023 • Paula Chen, Tingwei Meng, Zongren Zou, Jérôme Darbon, George Em Karniadakis
This connection allows us to reinterpret incremental updates to learned models as the evolution of an associated HJ PDE and optimal control problem in time, where all of the previous information is intrinsically encoded in the solution to the HJ PDE.
no code implementations • 30 Oct 2023 • Bin Lin, Zhiping Mao, Zhicheng Wang, George Em Karniadakis
Initially, we utilize DeepONet to learn the solution operator for a set of smooth problems relevant to the PDEs characterized by sharp solutions.
no code implementations • 16 Oct 2023 • Zongren Zou, Xuhui Meng, George Em Karniadakis
Despite the effectiveness of PINNs for discovering governing equations, the physical models encoded in PINNs may be misspecified in complex systems as some of the physical processes may not be fully understood, leading to the poor accuracy of PINN predictions.
no code implementations • 4 Oct 2023 • Felipe de Castro Teixeira Carvalho, Kamaljyoti Nath, Alberto Luiz Serpa, George Em Karniadakis
Electrical submersible pumps (ESPs) are prevalently utilized as artificial lift systems in the oil and gas industry.
1 code implementation • 3 Oct 2023 • Katarzyna Michałowska, Somdatta Goswami, George Em Karniadakis, Signe Riemer-Sørensen
Deep operator networks (DeepONets, DONs) offer a distinct advantage over traditional neural networks in their ability to be trained on multi-resolution data.
1 code implementation • 29 Sep 2023 • Nazanin Ahmadi Daryakenari, Mario De Florio, Khemraj Shukla, George Em Karniadakis
The proposed framework -- named AI-Aristotle -- combines eXtreme Theory of Functional Connections (X-TFC) domain-decomposition and Physics-Informed Neural Networks (PINNs) with symbolic regression (SR) techniques for parameter discovery and gray-box identification.
no code implementations • 31 Aug 2023 • Qian Zhang, Chenxi Wu, Adar Kahana, Youngeun Kim, Yuhang Li, George Em Karniadakis, Priyadarshini Panda
We introduce a method to convert Physics-Informed Neural Networks (PINNs), commonly used in scientific machine learning, to Spiking Neural Networks (SNNs), which are expected to have higher energy efficiency compared to traditional Artificial Neural Networks (ANNs).
1 code implementation • 9 Aug 2023 • Nikolas Borrel-Jensen, Somdatta Goswami, Allan P. Engsig-Karup, George Em Karniadakis, Cheol-Ho Jeong
We address the challenge of sound propagation simulations in 3D virtual rooms with moving sources, which have applications in virtual/augmented reality, game audio, and spatial computing.
1 code implementation • 23 Jul 2023 • Zheyuan Hu, Khemraj Shukla, George Em Karniadakis, Kenji Kawaguchi
We demonstrate in various diverse tests that the proposed method can solve many notoriously hard high-dimensional PDEs, including the Hamilton-Jacobi-Bellman (HJB) and the Schrödinger equations in tens of thousands of dimensions very fast on a single GPU using the PINN mesh-free approach.
no code implementations • 18 Jul 2023 • Elham Kiyani, Mahdi Kooshkbaghi, Khemraj Shukla, Rahul Babu Koneru, Zhen Li, Luis Bravo, Anindya Ghoshal, George Em Karniadakis, Mikko Karttunen
Subsequently, the closed-form dependency of the parameter values found by the PINN on the initial radii and contact angles is given using symbolic regression.
no code implementations • 18 Jul 2023 • Oded Ovadia, Vivek Oommen, Adar Kahana, Ahmad Peyvan, Eli Turkel, George Em Karniadakis
The proposed method, named Diffusion-inspired Temporal Transformer Operator (DiTTO), is inspired by latent diffusion models and their conditioning mechanism, which we use to incorporate the temporal evolution of the PDE, in combination with elements from the transformer architecture to improve its capabilities.
no code implementations • 16 Jul 2023 • Zhen Zhang, Zongren Zou, Ellen Kuhl, George Em Karniadakis
Specifically, we integrate physics informed neural networks (PINNs) and symbolic regression to discover a reaction-diffusion type partial differential equation for tau protein misfolding and spreading.
1 code implementation • 5 Jul 2023 • Alan John Varghese, Aniruddha Bora, Mengjia Xu, George Em Karniadakis
Hence, incorporating long-range dependencies from the historical graph context plays a crucial role in accurately learning their temporal dynamics.
1 code implementation • 1 Jul 2023 • Sokratis J. Anagnostopoulos, Juan Diego Toscano, Nikolaos Stergiopulos, George Em Karniadakis
Driven by the need for more efficient and seamless integration of physical models and data, physics-informed neural networks (PINNs) have seen a surge of interest in recent years.
1 code implementation • 30 Jun 2023 • Alena Kopaničáková, Hardik Kothari, George Em Karniadakis, Rolf Krause
We propose to enhance the training of physics-informed neural networks (PINNs).
no code implementations • 27 Jun 2023 • Varun Kumar, Leonard Gleyzer, Adar Kahana, Khemraj Shukla, George Em Karniadakis
To demonstrate the workflow of MyCrunchGPT and create an infrastructure that can facilitate a broader vision, we built a web-app-based guided user interface that includes options for a comprehensive summary report.
no code implementations • 18 May 2023 • Elham Kiyani, Khemraj Shukla, George Em Karniadakis, Mikko Karttunen
In addition, symbolic regression is employed to determine the closed form of the unknown part of the equation from the data, and the results confirm the accuracy of the X-PINN-based approach.
no code implementations • 4 May 2023 • Minglang Yin, Zongren Zou, Enrui Zhang, Cristina Cavinato, Jay D. Humphrey, George Em Karniadakis
Quantifying biomechanical properties of the human vasculature could deepen our understanding of cardiovascular diseases.
no code implementations • 26 Apr 2023 • Kamaljyoti Nath, Xuhui Meng, Daniel J Smith, George Em Karniadakis
In other words, the mean value model uses both the PINN model and the DNNs to represent the engine's states, with the PINN providing a physics-based understanding of the engine's overall dynamics and the DNNs offering a more engine-specific and adaptive representation of the empirical formulae.
1 code implementation • 15 Apr 2023 • Katiana Kontolati, Somdatta Goswami, George Em Karniadakis, Michael D. Shields
Operator regression provides a powerful means of constructing discretization-invariant emulators for partial-differential equations (PDEs) describing physical systems.
no code implementations • 2 Apr 2023 • Varun Kumar, Somdatta Goswami, Daniel J. Smith, George Em Karniadakis
As an alternative to physics based models, we develop an operator-based regression model (DeepONet) to learn the relevant output states for a mean-value gas flow engine model using the engine operating conditions as input variables.
1 code implementation • 22 Mar 2023 • Paula Chen, Tingwei Meng, Zongren Zou, Jérôme Darbon, George Em Karniadakis
Hamilton-Jacobi partial differential equations (HJ PDEs) have deep connections with a wide range of fields, including optimal control, differential games, and imaging sciences.
1 code implementation • 19 Mar 2023 • Qianying Cao, Somdatta Goswami, George Em Karniadakis
Herein, we demonstrate the superior approximation accuracy of a single Laplace layer in LNO over four Fourier modules in FNO in approximating the solutions of three ODEs (Duffing oscillator, driven gravity pendulum, and Lorenz system) and three PDEs (Euler-Bernoulli beam, diffusion equation, and reaction-diffusion system).
no code implementations • 15 Mar 2023 • Oded Ovadia, Adar Kahana, Panos Stinis, Eli Turkel, George Em Karniadakis
We combine vision transformers with operator learning to solve diverse inverse problems described by partial differential equations (PDEs).
no code implementations • 3 Mar 2023 • Katarzyna Michałowska, Somdatta Goswami, George Em Karniadakis, Signe Riemer-Sørensen
Deep neural networks are an attractive alternative for simulating complex dynamical systems, as in comparison to traditional scientific computing methods, they offer reduced computational costs during inference and can be trained directly from observational data.
1 code implementation • 28 Feb 2023 • Michael Penwarden, Ameya D. Jagtap, Shandian Zhe, George Em Karniadakis, Robert M. Kirby
This problem also arises, and is in some sense more difficult, in domain-decomposition strategies such as temporal decomposition using XPINNs.
1 code implementation • 23 Feb 2023 • Somdatta Goswami, Ameya D. Jagtap, Hessam Babaee, Bryan T. Susi, George Em Karniadakis
Specifically, to train the DeepONet for the syngas model, we solve the skeletal kinetic model for different initial conditions.
no code implementations • 7 Feb 2023 • Aniruddha Bora, Khemraj Shukla, Shixuan Zhang, Bryce Harrop, Ruby Leung, George Em Karniadakis
In this study, we replace the bias correction process with a surrogate model based on the Deep Operator Network (DeepONet).
no code implementations • 2 Feb 2023 • Khemraj Shukla, Vivek Oommen, Ahmad Peyvan, Michael Penwarden, Luis Bravo, Anindya Ghoshal, Robert M. Kirby, George Em Karniadakis
Deep neural operators, such as DeepONets, have changed the paradigm in high-dimensional nonlinear regression from function regression to (differential) operator regression, paving the way for significant changes in computational engineering applications.
no code implementations • 26 Jan 2023 • Qizhi He, Mauro Perego, Amanda A. Howard, George Em Karniadakis, Panos Stinis
One of the most challenging and consequential problems in climate modeling is to provide probabilistic projections of sea level rise.
no code implementations • 5 Jan 2023 • Zongren Zou, George Em Karniadakis
We introduce multi-head neural networks (MH-NNs) to physics-informed machine learning; these are neural networks (NNs) in which the nonlinear hidden layers form a shared body and multiple linear output layers form the heads.
1 code implementation • 13 Dec 2022 • Min Zhu, Handi Zhang, Anran Jiao, George Em Karniadakis, Lu Lu
Deep neural operators can learn nonlinear mappings between infinite-dimensional function spaces via deep neural networks.
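The mapping between function spaces described above can be sketched with a DeepONet-style branch-trunk decomposition. This is an illustrative toy with fixed maps in place of trained networks: the branch encodes the input function sampled at fixed sensors, the trunk encodes the query coordinate, and the operator output is their inner product.

```python
import math

def branch(sensor_values):
    # encode the input function u sampled at fixed sensor locations
    # (a trained DeepONet uses a deep network; this is a fixed toy map)
    s = sum(sensor_values)
    return [math.tanh(s), math.tanh(2 * s)]

def trunk(y):
    # encode the query coordinate y
    return [math.sin(y), math.cos(y)]

def deeponet(sensor_values, y):
    # G(u)(y) ~ <branch(u), trunk(y)>: evaluate the learned operator at y
    b, t = branch(sensor_values), trunk(y)
    return sum(bi * ti for bi, ti in zip(b, t))

sensors = [0.0, 0.25, 0.5, 0.75, 1.0]  # fixed sensor grid
u = [x ** 2 for x in sensors]          # an input function sampled at the sensors
out = deeponet(u, 0.3)
print(out)
```

Because the trunk takes the query point as a continuous input, the same trained operator can be evaluated at any resolution, which is the discretization-invariance property these entries exploit.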
no code implementations • 17 Nov 2022 • Qian Zhang, Adar Kahana, George Em Karniadakis, Panos Stinis
We propose a Spiking Neural Network (SNN)-based explicit numerical scheme for long time integration of time-dependent Ordinary and Partial Differential Equations (ODEs, PDEs).
1 code implementation • 16 Nov 2022 • Zheyuan Hu, Ameya D. Jagtap, George Em Karniadakis, Kenji Kawaguchi
We also show cases where XPINN is already better than PINN, so APINN can still slightly improve XPINN.
no code implementations • 6 Sep 2022 • Ameya D. Jagtap, George Em Karniadakis
For this purpose, we also discuss various requirements for activation functions that have been used in the physics-informed machine learning framework.
no code implementations • 28 Aug 2022 • Enrui Zhang, Adar Kahana, Alena Kopaničáková, Eli Turkel, Rishikesh Ranade, Jay Pathak, George Em Karniadakis
Neural networks suffer from spectral bias, having difficulty representing the high-frequency components of a function, while relaxation methods can resolve high frequencies efficiently but stall at moderate to low frequencies.
1 code implementation • 25 Aug 2022 • Zongren Zou, Xuhui Meng, Apostolos F Psaros, George Em Karniadakis
In this paper, we present an open-source Python library (https://github.com/Crunch-UQ4MI), termed NeuralUQ and accompanied by an educational tutorial, for employing UQ methods for SciML in a convenient and structured manner.
no code implementations • 21 Aug 2022 • Enrui Zhang, Bart Spronck, Jay D. Humphrey, George Em Karniadakis
Many genetic mutations adversely affect the structure and function of load-bearing soft tissues, with clinical sequelae often responsible for disability or death.
1 code implementation • 8 Jul 2022 • Somdatta Goswami, Aniruddha Bora, Yue Yu, George Em Karniadakis
Standard neural networks can approximate general nonlinear operators, represented either explicitly by a combination of mathematical operators, e.g., in an advection-diffusion-reaction partial differential equation, or simply as a black box, e.g., a system-of-systems.
no code implementations • 17 May 2022 • Adar Kahana, Qian Zhang, Leonard Gleyzer, George Em Karniadakis
We demonstrate this new approach for classification using the SNN in the branch, achieving results comparable to the literature.
no code implementations • 16 May 2022 • Khemraj Shukla, Mengjia Xu, Nathaniel Trask, George Em Karniadakis
For more complex systems or systems of systems and unstructured data, graph neural networks (GNNs) present some distinct advantages, and here we review how physics-informed learning can be accomplished with GNNs based on graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs).
no code implementations • 12 May 2022 • Kevin Linka, Amelie Schafer, Xuhui Meng, Zongren Zou, George Em Karniadakis, Ellen Kuhl
Our study reveals the inherent advantages and disadvantages of Neural Networks, Bayesian Inference, and a combination of both and provides valuable guidelines for model selection.
no code implementations • 8 May 2022 • Somdatta Goswami, David S. Li, Bruno V. Rego, Marcos Latorre, Jay D. Humphrey, George Em Karniadakis
Thoracic aortic aneurysm (TAA) is a localized dilatation of the aorta resulting from compromised wall composition, structure, and function, which can lead to life-threatening dissection or rupture.
1 code implementation • 20 Apr 2022 • Somdatta Goswami, Katiana Kontolati, Michael D. Shields, George Em Karniadakis
Transfer learning (TL) enables the transfer of knowledge gained in learning to perform one task (source) to a related but different task (target), hence addressing the expense of data acquisition and labeling, potential computational power limitations, and dataset distribution mismatches.
no code implementations • 11 Apr 2022 • Vivek Oommen, Khemraj Shukla, Somdatta Goswami, Remi Dingreville, George Em Karniadakis
We utilize the convolutional autoencoder to provide a compact representation of the microstructure data in a low-dimensional latent space.
no code implementations • 5 Apr 2022 • Ethan Pickering, Stephen Guth, George Em Karniadakis, Themistoklis P. Sapsis
This model-agnostic framework pairs a BED scheme that actively selects data for quantifying extreme events with an ensemble of DNOs that approximate infinite-dimensional nonlinear operators.
1 code implementation • 9 Mar 2022 • Katiana Kontolati, Somdatta Goswami, Michael D. Shields, George Em Karniadakis
In contrast, an even highly over-parameterized DeepONet leads to better generalization for both smooth and non-smooth dynamics.
no code implementations • 25 Feb 2022 • Minglang Yin, Enrui Zhang, Yue Yu, George Em Karniadakis
In this work, we explore the idea of multiscale modeling with machine learning and employ DeepONet, a neural operator, as an efficient surrogate of the expensive solver.
no code implementations • 23 Feb 2022 • Ameya D. Jagtap, Zhiping Mao, Nikolaus Adams, George Em Karniadakis
Accurate solutions to inverse supersonic compressible flow problems are often required for designing specialized aerospace vehicles.
2 code implementations • 3 Feb 2022 • Mitchell Daneker, Zhen Zhang, George Em Karniadakis, Lu Lu
The dynamics of systems biological processes are usually modeled by a system of ordinary differential equations (ODEs) with many unknown parameters that need to be inferred from noisy and sparse measurements.
1 code implementation • 19 Jan 2022 • Apostolos F Psaros, Xuhui Meng, Zongren Zou, Ling Guo, George Em Karniadakis
Neural networks (NNs) are currently changing the computational paradigm on how to combine data with mathematical laws in physics and engineering in a profound way, tackling challenging inverse and ill-posed problems not solvable with traditional methods.
1 code implementation • 14 Jan 2022 • Tingwei Meng, Zhen Zhang, Jérôme Darbon, George Em Karniadakis
Solving high-dimensional optimal control problems in real-time is an important but challenging problem, with applications to multi-agent path planning problems, which have drawn increased attention given the growing popularity of drones in recent years.
3 code implementations • 1 Nov 2021 • Jeremy Yu, Lu Lu, Xuhui Meng, George Em Karniadakis
We tested gPINNs extensively and demonstrated the effectiveness of gPINNs in both forward and inverse PDE problems.
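The gradient enhancement in gPINNs can be illustrated with a hedged sketch: in addition to the standard residual term, the loss penalizes the derivative of the residual at each collocation point. This toy uses the ODE u' + u = 0 and finite differences instead of automatic differentiation; the weight w is illustrative.

```python
import math

def residual(f, x, h=1e-4):
    # PDE residual r(x) = f'(x) + f(x) for the toy ODE u' + u = 0
    return (f(x + h) - f(x - h)) / (2 * h) + f(x)

def gpinn_loss(f, xs, w=0.1, h=1e-4):
    # standard PINN residual term plus the gradient-enhanced term on dr/dx
    r_term = sum(residual(f, x) ** 2 for x in xs) / len(xs)
    g_term = sum(((residual(f, x + h) - residual(f, x - h)) / (2 * h)) ** 2
                 for x in xs) / len(xs)
    return r_term + w * g_term

xs = [i / 10 for i in range(11)]
exact = lambda x: math.exp(-x)      # satisfies the ODE: loss is nearly zero
wrong = lambda x: math.exp(-2 * x)  # violates it: loss is clearly positive
print(gpinn_loss(exact, xs), gpinn_loss(wrong, xs))
```

Forcing dr/dx toward zero, not just r itself, is what gives gPINNs their improved accuracy per collocation point in the paper's experiments.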
1 code implementation • 28 Sep 2021 • Mengjia Xu, Apoorva Vikram Singh, George Em Karniadakis
However, recent advances mostly focus on learning node embeddings as deterministic "vectors" for static graphs, while disregarding key graph temporal dynamics and the evolving uncertainty associated with node embeddings in the latent space.
no code implementations • 20 Sep 2021 • Zheyuan Hu, Ameya D. Jagtap, George Em Karniadakis, Kenji Kawaguchi
Specifically, for general multi-layer PINNs and XPINNs, we first provide a prior generalization bound via the complexity of the target functions in the PDE problem, and a posterior generalization bound via the posterior matrix norms of the networks after optimization.
1 code implementation • 31 Aug 2021 • Zhen Zhang, Yeonjong Shin, George Em Karniadakis
We propose the GENERIC formalism informed neural networks (GFINNs) that obey the symmetric degeneracy conditions of the GENERIC formalism.
no code implementations • 25 Aug 2021 • Minglang Yin, Ehsan Ban, Bruno V. Rego, Enrui Zhang, Cristina Cavinato, Jay D. Humphrey, George Em Karniadakis
Aortic dissection progresses via delamination of the medial layer of the wall.
no code implementations • 12 Jul 2021 • Apostolos F Psaros, Kenji Kawaguchi, George Em Karniadakis
In the computational examples, the meta-learned losses are employed at test time for addressing regression and PDE task distributions.
no code implementations • 8 Jun 2021 • Xuhui Meng, Liu Yang, Zhiping Mao, Jose del Aguila Ferrandis, George Em Karniadakis
In summary, the proposed method is capable of learning flexible functional priors, and can be extended to big data problems using stochastic HMC or normalizing flows since the latent space is generally characterized as low dimensional.
no code implementations • 5 Jun 2021 • Qian Zhang, Konstantina Sampani, Mengjia Xu, Shengze Cai, Yixiang Deng, He Li, Jennifer K. Sun, George Em Karniadakis
Microaneurysms (MAs) are one of the earliest signs of diabetic retinopathy (DR), a frequent complication of diabetes that can lead to visual impairment and blindness.
no code implementations • 20 May 2021 • Shengze Cai, Zhiping Mao, Zhicheng Wang, Minglang Yin, George Em Karniadakis
Despite the significant progress over the last 50 years in simulating flow problems using numerical discretization of the Navier-Stokes equations (NSE), we still cannot incorporate seamlessly noisy data into existing algorithms, mesh-generation is complex, and we cannot tackle high-dimensional problems governed by parametrized NSE.
2 code implementations • 20 May 2021 • Ameya D. Jagtap, Yeonjong Shin, Kenji Kawaguchi, George Em Karniadakis
We propose a new type of neural networks, Kronecker neural networks (KNNs), that form a general framework for neural networks with adaptive activation functions.
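The adaptive-activation idea underlying KNNs can be illustrated with a minimal sketch, assuming the simplest form of adaptivity: a trainable slope parameter a rescales the activation input, sigma(a*x), and is learned jointly with the weights. The values below are illustrative, not from any trained model.

```python
import math

def adaptive_tanh(x, a):
    # activation with a trainable scaling parameter `a` (adaptive activation)
    return math.tanh(a * x)

def layer(inputs, weights, bias, a):
    # one dense neuron whose activation slope `a` is learned with the weights
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return adaptive_tanh(z, a)

# same pre-activation z = 1.0; a larger `a` sharpens the nonlinearity
print(layer([1.0, 2.0], [0.5, 0.25], 0.0, 1.0))  # tanh(1.0)
print(layer([1.0, 2.0], [0.5, 0.25], 0.0, 5.0))  # tanh(5.0), closer to 1
```

The Kronecker structure of KNNs generalizes this further, but the sketch captures why adaptive activations can accelerate training: the effective activation shape becomes part of the optimization.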
no code implementations • 6 Apr 2021 • Yeonjong Shin, Jérôme Darbon, George Em Karniadakis
We propose three versions -- non-adaptive, adaptive terminal, and adaptive order.
no code implementations • 17 Jan 2021 • Liu Yang, Tingwei Meng, George Em Karniadakis
We propose a simple but effective modification of the discriminators, namely measure-conditional discriminators, as a plug-and-play module for different GANs.
no code implementations • 23 Dec 2020 • Chensen Lin, Zhen Li, Lu Lu, Shengze Cai, Martin Maxey, George Em Karniadakis
Simulating and predicting multiscale problems that couple multiple physics and dynamics across many orders of spatiotemporal scales is a great challenge that has not been investigated systematically by deep neural networks (DNNs).
no code implementations • 19 Dec 2020 • Xuhui Meng, Hessam Babaee, George Em Karniadakis
We propose a new class of Bayesian neural networks (BNNs) that can be trained using noisy data of variable fidelity, and we apply them to learn function approximations as well as to solve inverse problems based on partial differential equations (PDEs).
1 code implementation • 5 Dec 2020 • Pengzhan Jin, Zhen Zhang, Ioannis G. Kevrekidis, George Em Karniadakis
We propose the Poisson neural networks (PNNs) to learn Poisson systems and trajectories of autonomous systems from data.