Search Results for author: Somdatta Goswami

Found 15 papers, 7 papers with code

DON-LSTM: Multi-Resolution Learning with DeepONets and Long Short-Term Memory Neural Networks

1 code implementation • 3 Oct 2023 • Katarzyna Michałowska, Somdatta Goswami, George Em Karniadakis, Signe Riemer-Sørensen

Deep operator networks (DeepONets, DONs) offer a distinct advantage over traditional neural networks in their ability to be trained on multi-resolution data.
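The multi-resolution property comes from the DeepONet architecture itself: a branch net encodes the input function at fixed sensor locations, a trunk net encodes an arbitrary query coordinate, and their inner product gives the output. Because the query coordinate is free, the same network can be evaluated on coarse or fine output grids. A minimal NumPy sketch with random (untrained, purely illustrative) weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_params(sizes):
    """Random weights for a small fully connected net (illustrative only)."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

# Branch net encodes the input function u sampled at m fixed sensors;
# trunk net encodes the query coordinate y; their dot product is G(u)(y).
m, p = 32, 16
branch = mlp_params([m, 64, p])
trunk = mlp_params([1, 64, p])

def deeponet(u_sensors, y):
    b = mlp(branch, u_sensors)         # (p,)
    t = mlp(trunk, y.reshape(-1, 1))   # (n_query, p)
    return t @ b                       # (n_query,)

# The same network evaluates on coarse or fine output grids --
# the multi-resolution property the paper exploits.
u = np.sin(np.linspace(0, np.pi, m))
coarse = deeponet(u, np.linspace(0, 1, 10))
fine = deeponet(u, np.linspace(0, 1, 100))
```

The layer widths and the LSTM coupling of the actual DON-LSTM model are not reproduced here; this only shows why the output discretization is decoupled from the input sensors.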

Sound propagation in realistic interactive 3D scenes with parameterized sources using deep neural operators

1 code implementation • 9 Aug 2023 • Nikolas Borrel-Jensen, Somdatta Goswami, Allan P. Engsig-Karup, George Em Karniadakis, Cheol-Ho Jeong

We address the challenge of sound propagation simulations in 3D virtual rooms with moving sources, which have applications in virtual/augmented reality, game audio, and spatial computing.

Learning in latent spaces improves the predictive accuracy of deep neural operators

1 code implementation • 15 Apr 2023 • Katiana Kontolati, Somdatta Goswami, George Em Karniadakis, Michael D. Shields

Operator regression provides a powerful means of constructing discretization-invariant emulators for partial differential equations (PDEs) describing physical systems.

Computational Efficiency

Real-Time Prediction of Gas Flow Dynamics in Diesel Engines using a Deep Neural Operator Framework

no code implementations • 2 Apr 2023 • Varun Kumar, Somdatta Goswami, Daniel J. Smith, George Em Karniadakis

As an alternative to physics-based models, we develop an operator-based regression model (DeepONet) to learn the relevant output states for a mean-value gas flow engine model using the engine operating conditions as input variables.

LNO: Laplace Neural Operator for Solving Differential Equations

no code implementations • 19 Mar 2023 • Qianying Cao, Somdatta Goswami, George Em Karniadakis

Herein, we demonstrate the superior approximation accuracy of a single Laplace layer in LNO over four Fourier modules in FNO in approximating the solutions of three ODEs (Duffing oscillator, driven gravity pendulum, and Lorenz system) and three PDEs (Euler-Bernoulli beam, diffusion equation, and reaction-diffusion system).
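The FNO baseline mentioned here can be sketched concretely: each Fourier module transforms the signal to frequency space, multiplies a truncated set of low modes by learned complex weights, and transforms back. (LNO instead parameterizes the kernel in the Laplace domain, which is better suited to transient responses; that part is not reproduced here.) A dependency-free sketch of one such Fourier layer with random stand-in weights:

```python
import numpy as np

rng = np.random.default_rng(1)

def fourier_layer(v, weights, n_modes):
    """One FNO-style spectral convolution on a 1D signal:
    FFT -> scale the lowest n_modes by learned complex weights -> inverse FFT."""
    v_hat = np.fft.rfft(v)
    out_hat = np.zeros_like(v_hat)
    out_hat[:n_modes] = v_hat[:n_modes] * weights  # truncate and mix modes
    return np.fft.irfft(out_hat, n=v.size)

n, n_modes = 128, 8
weights = rng.standard_normal(n_modes) + 1j * rng.standard_normal(n_modes)

x = np.linspace(0, 2 * np.pi, n, endpoint=False)
v = np.sin(3 * x) + 0.5 * np.sin(11 * x)  # modes 3 and 11
out = fourier_layer(v, weights, n_modes)  # mode 11 is filtered out
```

The hard mode truncation visible here (mode 11 is discarded with n_modes = 8) is one reason a Laplace-domain parameterization can be more economical for the ODE/PDE benchmarks listed above.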

Operator learning

Neural Operator Learning for Long-Time Integration in Dynamical Systems with Recurrent Neural Networks

no code implementations • 3 Mar 2023 • Katarzyna Michałowska, Somdatta Goswami, George Em Karniadakis, Signe Riemer-Sørensen

Deep neural networks are an attractive alternative for simulating complex dynamical systems, as in comparison to traditional scientific computing methods, they offer reduced computational costs during inference and can be trained directly from observational data.

Operator learning

Learning stiff chemical kinetics using extended deep neural operators

1 code implementation • 23 Feb 2023 • Somdatta Goswami, Ameya D. Jagtap, Hessam Babaee, Bryan T. Susi, George Em Karniadakis

Specifically, to train the DeepONet for the syngas model, we solve the skeletal kinetic model for different initial conditions.


Physics-Informed Deep Neural Operator Networks

no code implementations • 8 Jul 2022 • Somdatta Goswami, Aniruddha Bora, Yue Yu, George Em Karniadakis

Standard neural networks can approximate general nonlinear operators, represented either explicitly by a combination of mathematical operators, e.g., in an advection-diffusion-reaction partial differential equation, or simply as a black box, e.g., a system-of-systems.

Uncertainty Quantification

Neural operator learning of heterogeneous mechanobiological insults contributing to aortic aneurysms

no code implementations • 8 May 2022 • Somdatta Goswami, David S. Li, Bruno V. Rego, Marcos Latorre, Jay D. Humphrey, George Em Karniadakis

Thoracic aortic aneurysm (TAA) is a localized dilatation of the aorta resulting from compromised wall composition, structure, and function, which can lead to life-threatening dissection or rupture.

Operator learning

Deep transfer operator learning for partial differential equations under conditional shift

1 code implementation • 20 Apr 2022 • Somdatta Goswami, Katiana Kontolati, Michael D. Shields, George Em Karniadakis

Transfer learning (TL) enables the transfer of knowledge gained in learning to perform one task (source) to a related but different task (target), hence addressing the expense of data acquisition and labeling, potential computational power limitations, and dataset distribution mismatches.

Domain Adaptation • Operator learning +4

Learning two-phase microstructure evolution using neural operators and autoencoder architectures

no code implementations • 11 Apr 2022 • Vivek Oommen, Khemraj Shukla, Somdatta Goswami, Remi Dingreville, George Em Karniadakis

We utilize the convolutional autoencoder to provide a compact representation of the microstructure data in a low-dimensional latent space.
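The two-stage pipeline described here — compress the high-dimensional microstructure field to a latent vector, evolve it there, decode back — can be sketched with stand-in components. Below, a linear encoder/decoder replaces the paper's convolutional autoencoder and a toy decay map replaces the neural operator, purely to show the data flow and shapes:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in linear encoder/decoder with tied weights (the paper uses a
# convolutional autoencoder; a linear map keeps this sketch dependency-free).
H = W = 64   # microstructure field resolution
d = 16       # latent dimension
E = rng.standard_normal((H * W, d)) / np.sqrt(H * W)  # encoder weights
D = E.T                                               # tied decoder weights

def encode(field):
    return field.reshape(-1) @ E      # (H*W,) -> (d,)

def decode(z):
    return (z @ D).reshape(H, W)      # (d,) -> (H, W)

def latent_operator(z, dt=0.1):
    """Placeholder for the neural operator that advances the latent state
    in time (a DeepONet in the paper); toy linear decay dynamics here."""
    return z * np.exp(-dt)

field_t0 = rng.standard_normal((H, W))
z = encode(field_t0)              # compact latent representation
z_next = latent_operator(z)       # time stepping happens in latent space
field_t1 = decode(z_next)         # reconstructed microstructure field
```

The payoff of this design is that the operator acts on a d-dimensional vector instead of an H×W field, which is what makes the surrogate cheap to evaluate.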


On the influence of over-parameterization in manifold based surrogates and deep neural operators

1 code implementation • 9 Mar 2022 • Katiana Kontolati, Somdatta Goswami, Michael D. Shields, George Em Karniadakis

In contrast, even a highly over-parameterized DeepONet leads to better generalization for both smooth and non-smooth dynamics.

Operator learning

A physics-informed variational DeepONet for predicting the crack path in brittle materials

no code implementations • 16 Aug 2021 • Somdatta Goswami, Minglang Yin, Yue Yu, George Karniadakis

We propose a physics-informed variational formulation of DeepONet (V-DeepONet) for brittle fracture analysis.

Transfer learning enhanced physics informed neural network for phase-field modeling of fracture

no code implementations • 4 Jul 2019 • Somdatta Goswami, Cosmin Anitescu, Souvik Chakraborty, Timon Rabczuk

While most of the PINN algorithms available in the literature minimize the residual of the governing partial differential equation, the proposed approach takes a different path by minimizing the variational energy of the system.
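The distinction between the two loss choices can be made concrete on a 1D model problem. For u'' = -f on [0, 1] with u(0) = u(1) = 0, a residual-based PINN penalizes the strong-form residual, while the energy approach minimizes E[u] = ∫ (½ u'² - f u) dx, whose minimizer solves the same equation but requires only first derivatives. A finite-difference sketch (a neural network would replace the grid values of u in an actual PINN):

```python
import numpy as np

# Model problem: u'' = -f on [0, 1], u(0) = u(1) = 0,
# with f chosen so the exact solution is u = sin(pi x).
n = 101
x = np.linspace(0, 1, n)
h = x[1] - x[0]
f = np.pi**2 * np.sin(np.pi * x)
u = np.sin(np.pi * x)  # exact solution, used to evaluate both losses

def residual_loss(u):
    """Strong-form PDE residual, as in most PINN algorithms."""
    r = (u[:-2] - 2 * u[1:-1] + u[2:]) / h**2 + f[1:-1]
    return np.mean(r**2)

def energy_loss(u):
    """Variational energy E[u] = integral of (1/2 u'^2 - f u),
    approximated with np.gradient and the trapezoidal rule.
    Only first derivatives of u are needed."""
    du = np.gradient(u, x)
    g = 0.5 * du**2 - f * u
    return h * (0.5 * g[0] + g[1:-1].sum() + 0.5 * g[-1])
```

At the exact solution the residual loss is near zero while the energy loss sits at its minimum value of -π²/4; minimizing the energy therefore avoids computing second derivatives of the network, which is one practical advantage of the variational formulation.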

Numerical Integration • Transfer Learning
