no code implementations • 9 Feb 2024 • Ressi Bonti Muhammad, Apoorv Srivastava, Sergey Alyaev, Reidar Brumer Bratvold, Daniel M. Tartakovsky
We integrate RL-based geosteering with particle filtering (PF) to address realistic geosteering scenarios.
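As a minimal sketch of the particle-filter ingredient such an RL+PF loop relies on, the snippet below runs a bootstrap particle filter on a toy scalar state. The random-walk process model, noise levels, and variable names are illustrative placeholders, not the paper's geological model.

```python
import numpy as np

# Bootstrap particle filter on a hypothetical 1-D state (illustrative only).
rng = np.random.default_rng(4)
n_particles = 5000
true_state = 2.0
particles = rng.normal(0.0, 3.0, n_particles)   # prior ensemble over the state

for _ in range(10):
    # Propagate each particle through a random-walk process model.
    particles += rng.normal(0.0, 0.1, n_particles)
    # Weight particles by the likelihood of a noisy measurement (sigma = 0.5).
    z = true_state + rng.normal(0.0, 0.5)
    w = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    w /= w.sum()
    # Resample with replacement so particles concentrate on likely states.
    particles = particles[rng.choice(n_particles, n_particles, p=w)]

print(particles.mean())   # posterior mean should approach the true state
```

In a geosteering setting, the resampled ensemble would represent the posterior over subsurface geometry, and the RL agent would condition its steering decisions on it.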
no code implementations • 23 Aug 2023 • Abhishek Chandra, Taniya Kapoor, Bram Daniels, Mitrofan Curti, Koen Tiels, Daniel M. Tartakovsky, Elena A. Lomonova
Hysteresis is a ubiquitous phenomenon in science and engineering; its modeling and identification are crucial for understanding and optimizing the behavior of various systems.
1 code implementation • 17 Aug 2023 • Taniya Kapoor, Abhishek Chandra, Daniel M. Tartakovsky, Hongrui Wang, Alfredo Nunez, Rolf Dollevoet
A primary challenge of physics-informed machine learning (PIML) is its generalization beyond the training domain, especially when dealing with complex physical problems represented by partial differential equations (PDEs).
no code implementations • 27 Jun 2023 • Hannah Lu, Daniel M. Tartakovsky
We present a data-driven learning approach for unknown nonautonomous dynamical systems with time-dependent inputs based on dynamic mode decomposition (DMD).
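A minimal sketch of the DMD building block: given snapshot pairs from an (unknown) linear map, exact DMD recovers the dynamics via a pseudoinverse. The 2x2 system below is a synthetic stand-in, not the paper's nonautonomous model, which additionally handles time-dependent inputs.

```python
import numpy as np

# Exact dynamic mode decomposition (DMD) on snapshot data (illustrative only).
rng = np.random.default_rng(0)
A = np.array([[0.9, -0.1],
              [0.0,  0.8]])           # hypothetical linear dynamics x_{k+1} = A x_k
x = rng.standard_normal(2)
snapshots = [x]
for _ in range(20):
    x = A @ x
    snapshots.append(x)
X = np.column_stack(snapshots[:-1])   # states at times 0 .. T-1
Y = np.column_stack(snapshots[1:])    # states at times 1 .. T

# DMD fits the best linear operator with Y ~ A_dmd X via the pseudoinverse;
# its eigenvalues/eigenvectors are the DMD spectrum and modes.
A_dmd = Y @ np.linalg.pinv(X)
eigvals = np.sort(np.linalg.eigvals(A_dmd).real)
print(eigvals)   # recovers the true eigenvalues 0.8 and 0.9
```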
1 code implementation • 10 Feb 2023 • Abhishek Chandra, Bram Daniels, Mitrofan Curti, Koen Tiels, Elena A. Lomonova, Daniel M. Tartakovsky
This article presents an approach for modelling hysteresis in piezoelectric materials that leverages recent advances in machine learning, particularly sparse-regression techniques.
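To illustrate the sparse-regression idea, the sketch below runs sequential thresholded least squares (in the spirit of SINDy-style identification) on synthetic data. The candidate library and target signal are hypothetical placeholders, not the paper's hysteresis model.

```python
import numpy as np

# Sequential thresholded least squares on a synthetic signal (illustrative only).
t = np.linspace(0.0, 2.0 * np.pi, 200)
x = np.sin(t)
dxdt = np.cos(t)                      # target to be explained sparsely

# Candidate library [1, x, x^2, cos(t)]; only cos(t) is truly active.
library = np.column_stack([np.ones_like(t), x, x**2, np.cos(t)])
coeffs, *_ = np.linalg.lstsq(library, dxdt, rcond=None)

# Zero out small coefficients and refit on the surviving terms.
for _ in range(5):
    small = np.abs(coeffs) < 0.1
    coeffs[small] = 0.0
    active = ~small
    if active.any():
        coeffs[active], *_ = np.linalg.lstsq(library[:, active], dxdt, rcond=None)
print(coeffs)   # sparse model: only the cos(t) coefficient survives, ~1
```

The thresholding loop is what yields an interpretable, parsimonious model rather than a dense regression fit.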
no code implementations • 1 Nov 2022 • Wei Kang, Daniel M. Tartakovsky, Apoorv Srivastava
We introduce a mathematical formulation of feature-informed data assimilation (FIDA).
no code implementations • 4 Feb 2022 • Martha D'Elia, Hang Deng, Cedric Fraces, Krishna Garikipati, Lori Graham-Brady, Amanda Howard, George Karniadakis, Vahid Keshavarzzadeh, Robert M. Kirby, Nathan Kutz, Chunhui Li, Xing Liu, Hannah Lu, Pania Newell, Daniel O'Malley, Masa Prodanovic, Gowri Srinivasan, Alexandre Tartakovsky, Daniel M. Tartakovsky, Hamdi Tchelepi, Bozo Vazic, Hari Viswanathan, Hongkyu Yoon, Piotr Zarzycki
The "Workshop on Machine learning in heterogeneous porous materials" brought together international scientific communities of applied mathematics, porous media, and material sciences with experts in the areas of heterogeneous materials, machine learning (ML) and applied mathematics to identify how ML can advance materials research.
no code implementations • 24 Oct 2021 • Zitong Zhou, Nicholas Zabaras, Daniel M. Tartakovsky
We use a convolutional adversarial autoencoder (CAAE) for the parameterization of the heterogeneous non-Gaussian conductivity field with a low-dimensional latent representation.
no code implementations • 29 Apr 2021 • Dong H. Song, Daniel M. Tartakovsky
The former is reported relative to both CNN training on high-fidelity images only and Monte Carlo solution of the PDEs.
no code implementations • 22 Nov 2020 • Tyler E. Maltba, Hongli Zhao, Daniel M. Tartakovsky
When random excitations are modeled as Gaussian white noise, the joint PDF of neuron states satisfies a Fokker-Planck equation exactly.
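The white-noise/Fokker-Planck correspondence can be checked numerically on a toy model. For an Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW, the Fokker-Planck equation's stationary density is Gaussian with variance sigma^2/(2*theta); the sketch below verifies this by Euler-Maruyama simulation. The scalar OU model is an illustrative stand-in, not the paper's neuron model.

```python
import numpy as np

# Euler-Maruyama ensemble for dX = -theta*X dt + sigma dW (illustrative only).
rng = np.random.default_rng(2)
theta, sigma, dt = 1.0, 0.5, 1e-2
n_steps, n_paths = 2000, 5000

x = np.zeros(n_paths)
for _ in range(n_steps):
    # Drift step plus a Gaussian white-noise increment of variance dt.
    x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

empirical_var = x.var()
fp_var = sigma**2 / (2.0 * theta)     # Fokker-Planck stationary variance
print(empirical_var, fp_var)          # agree to within Monte Carlo error
```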
no code implementations • 7 Sep 2020 • Søren Taverniers, Eric J. Hall, Markos A. Katsoulakis, Daniel M. Tartakovsky
Timely completion of design cycles for complex systems ranging from consumer electronics to hypersonic vehicles relies on rapid simulation-based prototyping.
no code implementations • 26 Jun 2020 • Eric J. Hall, Søren Taverniers, Markos A. Katsoulakis, Daniel M. Tartakovsky
We introduce the concept of a Graph-Informed Neural Network (GINN), a hybrid approach combining deep learning with probabilistic graphical models (PGMs) that acts as a surrogate for physics-based representations of multiscale and multiphysics systems.
1 code implementation • 15 Mar 2020 • Francesca Boso, Dimitris Boskos, Jorge Cortés, Sonia Martínez, Daniel M. Tartakovsky
This study focuses on the latter step by investigating the spatio-temporal evolution of data-driven ambiguity sets and their associated guarantees when the random QoIs they describe obey hyperbolic partial-differential equations with random inputs.
Optimization and Control • Analysis of PDEs
no code implementations • 30 Jan 2020 • Joseph Bakarji, Daniel M. Tartakovsky
Statistical (machine learning) tools for equation discovery require large amounts of data that are typically computer generated rather than experimentally observed.
no code implementations • 6 Jan 2019 • Kimoon Um, Eric Joseph Hall, Markos A. Katsoulakis, Daniel M. Tartakovsky
The global sensitivity indices are used to rank the effect of uncertainty in microscopic parameters on macroscopic QoIs, to quantify the impact of causality on the multiscale model's predictions, and to provide physical interpretations of these results for hierarchical nanoporous materials.
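A minimal sketch of variance-based global sensitivity analysis: first-order Sobol indices estimated with a pick-freeze (Saltelli-style) Monte Carlo scheme. The linear model f below is a hypothetical stand-in for a macroscopic QoI depending on two microscopic parameters; for it, the exact indices are 0.2 and 0.8.

```python
import numpy as np

# First-order Sobol indices via pick-freeze sampling (illustrative only).
rng = np.random.default_rng(3)

def f(a, b):
    return a + 2.0 * b                # Var(Y) = 1 + 4 = 5 for iid N(0,1) inputs

n = 200_000
A = rng.standard_normal((n, 2))       # two independent input samples
B = rng.standard_normal((n, 2))
yA = f(A[:, 0], A[:, 1])
yB = f(B[:, 0], B[:, 1])

S = []
for i in range(2):
    # Copy sample A, but replace parameter i with its values from sample B.
    AB = A.copy()
    AB[:, i] = B[:, i]
    yAB = f(AB[:, 0], AB[:, 1])
    # First-order index S_i = Var(E[Y | X_i]) / Var(Y), pick-freeze estimator.
    S.append(np.mean(yB * (yAB - yA)) / yA.var())
print(S)   # ~[0.2, 0.8]: the second parameter drives 80% of output variance
```

Ranking parameters by these indices is what identifies which microscopic uncertainties matter most for the macroscopic QoIs.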