Search Results for author: Daniel M. Tartakovsky

Found 15 papers, 3 papers with code

Neural oscillators for magnetic hysteresis modeling

no code implementations23 Aug 2023 Abhishek Chandra, Taniya Kapoor, Bram Daniels, Mitrofan Curti, Koen Tiels, Daniel M. Tartakovsky, Elena A. Lomonova

Hysteresis is a ubiquitous phenomenon in science and engineering; its modeling and identification are crucial for understanding and optimizing the behavior of various systems.

Neural oscillators for generalization of physics-informed machine learning

1 code implementation17 Aug 2023 Taniya Kapoor, Abhishek Chandra, Daniel M. Tartakovsky, Hongrui Wang, Alfredo Nunez, Rolf Dollevoet

A primary challenge of physics-informed machine learning (PIML) is its generalization beyond the training domain, especially when dealing with complex physical problems represented by partial differential equations (PDEs).

Physics-informed machine learning

Learning Nonautonomous Systems via Dynamic Mode Decomposition

no code implementations27 Jun 2023 Hannah Lu, Daniel M. Tartakovsky

We present a data-driven learning approach for unknown nonautonomous dynamical systems with time-dependent inputs based on dynamic mode decomposition (DMD).

Dimensionality Reduction
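The core of any DMD-based approach is fitting a best-fit linear operator between successive snapshots. A minimal sketch of exact DMD on a toy autonomous linear system (illustrative only; the paper extends DMD to nonautonomous systems with time-dependent inputs, which this sketch does not cover):

```python
import numpy as np

def dmd(X, Xprime, r):
    """Exact DMD: fit X' ~ A X, returning the rank-r reduced operator,
    its eigenvalues, and the corresponding DMD modes."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r]
    Atilde = U.conj().T @ Xprime @ Vh.conj().T / s   # r x r reduced operator
    eigvals, W = np.linalg.eig(Atilde)
    modes = Xprime @ Vh.conj().T / s @ W             # exact DMD modes
    return Atilde, eigvals, modes

# Toy linear system x_{k+1} = A x_k; DMD should recover A's eigenvalues.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
X = np.zeros((2, 50))
X[:, 0] = [1.0, 1.0]
for k in range(49):
    X[:, k + 1] = A @ X[:, k]
Atilde, eigvals, modes = dmd(X[:, :-1], X[:, 1:], r=2)
# np.sort(np.real(eigvals)) recovers [0.8, 0.9], the spectrum of A.
```

On exact linear data the recovered eigenvalues match the true dynamics to machine precision; the learned modes and eigenvalues can then be used for forecasting and model reduction.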

Discovery of sparse hysteresis models for piezoelectric materials

1 code implementation10 Feb 2023 Abhishek Chandra, Bram Daniels, Mitrofan Curti, Koen Tiels, Elena A. Lomonova, Daniel M. Tartakovsky

This article presents an approach for modelling hysteresis in piezoelectric materials that leverages recent advancements in machine learning, particularly sparse-regression techniques.

regression
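Sparse-regression discovery of this kind typically selects a few active terms from a library of candidates. A minimal sketch of sequentially thresholded least squares (a standard sparse-regression routine, shown on synthetic data; the paper's hysteresis-specific libraries are not reproduced here):

```python
import numpy as np

def stlsq(Theta, y, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: find sparse xi with Theta @ xi ~ y."""
    xi = np.linalg.lstsq(Theta, y, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0                 # prune negligible coefficients
        big = ~small
        if big.any():                   # refit on the surviving terms
            xi[big] = np.linalg.lstsq(Theta[:, big], y, rcond=None)[0]
    return xi

# Toy problem: y = 2*x1 - 0.5*x3 hidden in a library of five candidate terms.
rng = np.random.default_rng(0)
Theta = rng.normal(size=(200, 5))
y = 2.0 * Theta[:, 1] - 0.5 * Theta[:, 3]
xi = stlsq(Theta, y, threshold=0.1)
# xi recovers [0, 2, 0, -0.5, 0]: only the two true terms survive.
```

The thresholding step is what yields an interpretable model: coefficients that cannot earn their keep are zeroed out, leaving a short, physically readable equation.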

Feature-Informed Data Assimilation -- Definitions and Illustrative Examples

no code implementations1 Nov 2022 Wei Kang, Daniel M. Tartakovsky, Apoorv Srivastava

We introduce a mathematical formulation of feature-informed data assimilation (FIDA).

Machine Learning in Heterogeneous Porous Materials

no code implementations4 Feb 2022 Martha D'Elia, Hang Deng, Cedric Fraces, Krishna Garikipati, Lori Graham-Brady, Amanda Howard, George Karniadakis, Vahid Keshavarzzadeh, Robert M. Kirby, Nathan Kutz, Chunhui Li, Xing Liu, Hannah Lu, Pania Newell, Daniel O'Malley, Masa Prodanovic, Gowri Srinivasan, Alexandre Tartakovsky, Daniel M. Tartakovsky, Hamdi Tchelepi, Bozo Vazic, Hari Viswanathan, Hongkyu Yoon, Piotr Zarzycki

The "Workshop on Machine learning in heterogeneous porous materials" brought together the international applied-mathematics, porous-media, and materials-science communities with experts in heterogeneous materials and machine learning (ML) to identify how ML can advance materials research.

BIG-bench Machine Learning

Deep Learning for Simultaneous Inference of Hydraulic and Transport Properties

no code implementations24 Oct 2021 Zitong Zhou, Nicholas Zabaras, Daniel M. Tartakovsky

We use a convolutional adversarial autoencoder (CAAE) for the parameterization of the heterogeneous non-Gaussian conductivity field with a low-dimensional latent representation.

Computational Efficiency

Transfer Learning on Multi-Fidelity Data

no code implementations29 Apr 2021 Dong H. Song, Daniel M. Tartakovsky

The former is reported relative to both CNN training on high-fidelity images only and Monte Carlo solution of the PDEs.

Transfer Learning

Autonomous learning of nonlocal stochastic neuron dynamics

no code implementations22 Nov 2020 Tyler E. Maltba, Hongli Zhao, Daniel M. Tartakovsky

When random excitations are modeled as Gaussian white noise, the joint PDF of neuron states satisfies exactly a Fokker-Planck equation.
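The Gaussian-white-noise/Fokker-Planck correspondence can be checked on a toy Ornstein-Uhlenbeck SDE (an illustrative stand-in, not the paper's nonlocal neuron model): the stationary Fokker-Planck density is Gaussian with variance sigma^2 / (2*theta), which Monte Carlo paths should reproduce.

```python
import numpy as np

# Ornstein-Uhlenbeck SDE: dx = -theta*x dt + sigma dW.
# Its stationary Fokker-Planck solution is N(0, sigma^2 / (2*theta)).
theta, sigma = 1.0, 0.5
dt, n_steps, n_paths = 2e-3, 5000, 5000   # integrate to t = 10 >> 1/theta
rng = np.random.default_rng(1)

x = np.zeros(n_paths)
for _ in range(n_steps):                  # Euler-Maruyama over all paths at once
    x += -theta * x * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)

var_mc = x.var()                          # Monte Carlo stationary variance
var_fp = sigma**2 / (2 * theta)           # Fokker-Planck prediction: 0.125
```

With 5000 paths the sample variance agrees with the Fokker-Planck value to within Monte Carlo error, illustrating why white-noise forcing makes the joint PDF exactly tractable.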

Mutual Information for Explainable Deep Learning of Multiscale Systems

no code implementations7 Sep 2020 Søren Taverniers, Eric J. Hall, Markos A. Katsoulakis, Daniel M. Tartakovsky

Timely completion of design cycles for complex systems ranging from consumer electronics to hypersonic vehicles relies on rapid simulation-based prototyping.

Uncertainty Quantification

GINNs: Graph-Informed Neural Networks for Multiscale Physics

no code implementations26 Jun 2020 Eric J. Hall, Søren Taverniers, Markos A. Katsoulakis, Daniel M. Tartakovsky

We introduce the concept of a Graph-Informed Neural Network (GINN), a hybrid approach combining deep learning with probabilistic graphical models (PGMs) that acts as a surrogate for physics-based representations of multiscale and multiphysics systems.

Decision Making

Dynamics of Data-driven Ambiguity Sets for Hyperbolic Conservation Laws with Uncertain Inputs

1 code implementation15 Mar 2020 Francesca Boso, Dimitris Boskos, Jorge Cortés, Sonia Martínez, Daniel M. Tartakovsky

This study focuses on the latter step by investigating the spatio-temporal evolution of data-driven ambiguity sets and their associated guarantees when the random QoIs they describe obey hyperbolic partial-differential equations with random inputs.

Optimization and Control, Analysis of PDEs

Data-Driven Discovery of Coarse-Grained Equations

no code implementations30 Jan 2020 Joseph Bakarji, Daniel M. Tartakovsky

Statistical (machine learning) tools for equation discovery require large amounts of data that are typically computer generated rather than experimentally observed.

BIG-bench Machine Learning

Causality and Bayesian network PDEs for multiscale representations of porous media

no code implementations6 Jan 2019 Kimoon Um, Eric Joseph Hall, Markos A. Katsoulakis, Daniel M. Tartakovsky

The global sensitivity indices are used to rank the effect of uncertainty in microscopic parameters on macroscopic QoIs, to quantify the impact of causality on the multiscale model's predictions, and to provide physical interpretations of these results for hierarchical nanoporous materials.
