no code implementations • 18 Apr 2024 • Kislaya Ravi, Vladyslav Fediukov, Felix Dietrich, Tobias Neckel, Fabian Buse, Michael Bergmann, Hans-Joachim Bungartz
One of the main challenges in surrogate modeling is the limited availability of data due to resource constraints associated with computationally expensive simulations.
1 code implementation • 24 Mar 2024 • Chinmay Datar, Adwait Datar, Felix Dietrich, Wil Schilders
Discovering a suitable neural network architecture for modeling complex dynamical systems poses a formidable challenge, often involving extensive trial and error and navigation through a high-dimensional hyper-parameter space.
no code implementations • 20 Dec 2023 • Erez Peterfreund, Iryna Burak, Ofir Lindenbaum, Jim Gimlett, Felix Dietrich, Ronald R. Coifman, Ioannis G. Kevrekidis
Fusing measurements from multiple, heterogeneous, partial sources observing a common object or process becomes increasingly challenging as the number and variety of available sensors grow.
no code implementations • 24 Apr 2023 • Ioannis K. Gallos, Daniel Lehmberg, Felix Dietrich, Constantinos Siettos
Importantly, we show that the proposed Koopman operator approach provides, for all practical purposes, results equivalent to those of the FNN-GH approach, thus bypassing the need to train a non-linear map and to use GH to extrapolate predictions in the ambient fMRI space. Instead, one can use the low-frequency truncation of the Diffusion Maps (DMs) space of L^2-integrable functions to predict the entire list of coordinate functions in the fMRI space and to solve the pre-image problem.
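A minimal sketch of how a linear Koopman approximation can be computed from snapshot data via exact Dynamic Mode Decomposition; the function name, toy linear system, and rank choice are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def dmd(X, Y, rank):
    """Exact DMD: approximate the Koopman operator by a rank-truncated
    linear map A with Y ~ A X.
    X, Y: (n_features, n_snapshots) arrays of consecutive snapshots."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # Projected operator on the leading POD subspace
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    # DMD modes lifted back to the ambient space
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes, A_tilde

# Toy check: snapshots generated by a known linear map are recovered
rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
X = rng.standard_normal((2, 50))
Y = A_true @ X
eigvals, modes, _ = dmd(X, Y, rank=2)
print(np.sort(eigvals.real))  # close to the true eigenvalues 0.8 and 0.9
```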
no code implementations • 22 Nov 2022 • Danimir T. Doncevic, Alexander Mitsos, Yue Guo, Qianxiao Li, Felix Dietrich, Manuel Dahmen, Ioannis G. Kevrekidis
Meta-learning of numerical algorithms for a given task consists of the data-driven identification and adaptation of an algorithmic structure and the associated hyperparameters.
1 code implementation • 1 Aug 2022 • Philipp Scholl, Felix Dietrich, Clemens Otte, Steffen Udluft
Based on this finding, we develop adaptations, the Adv-Soft-SPIBB algorithms, and show that they are provably safe.
1 code implementation • 30 Apr 2022 • Nikolaos Evangelou, Felix Dietrich, Juan M. Bello-Rivas, Alex Yeh, Rachel Stein, Michael A. Bevan, Ioannis G. Kevrekidis
We construct a reduced, data-driven, parameter dependent effective Stochastic Differential Equation (eSDE) for electric-field mediated colloidal crystallization using data obtained from Brownian Dynamics Simulations.
no code implementations • 26 Apr 2022 • Nikolaos Evangelou, Felix Dietrich, Eliodoro Chiavazzo, Daniel Lehmberg, Marina Meila, Ioannis G. Kevrekidis
A second round of Diffusion Maps on those latent coordinates allows the approximation of the reduced dynamical models.
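A single Diffusion Maps pass of the kind iterated here can be sketched as follows; the alpha = 1 density normalization, bandwidth, and toy circle data are illustrative assumptions:

```python
import numpy as np

def diffusion_maps(X, epsilon, n_coords):
    """One Diffusion Maps pass: Gaussian kernel, density normalization
    (alpha = 1), then eigenvectors of the Markov matrix as coordinates."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / epsilon)
    # alpha = 1 normalization removes the sampling-density bias
    q = K.sum(axis=1)
    K = K / np.outer(q, q)
    P = K / K.sum(axis=1, keepdims=True)   # row-stochastic Markov matrix
    eigvals, eigvecs = np.linalg.eig(P)
    order = np.argsort(-eigvals.real)
    # Skip the trivial constant eigenvector (eigenvalue 1)
    return eigvecs.real[:, order[1:n_coords + 1]], eigvals.real[order]

# Toy usage: points on a circle yield two leading non-trivial coordinates
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
X = np.c_[np.cos(theta), np.sin(theta)]
coords, eigvals = diffusion_maps(X, epsilon=0.5, n_coords=2)
print(coords.shape)  # (60, 2)
```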
1 code implementation • 28 Jan 2022 • Philipp Scholl, Felix Dietrich, Clemens Otte, Steffen Udluft
Safe Policy Improvement (SPI) aims at provable guarantees that a learned policy is at least approximately as good as a given baseline policy.
no code implementations • 13 Oct 2021 • Nikolaos Evangelou, Noah J. Wichrowski, George A. Kevrekidis, Felix Dietrich, Mahdi Kooshkbaghi, Sarah McFann, Ioannis G. Kevrekidis
We present a data-driven approach to characterizing nonidentifiability of a model's parameters and illustrate it through dynamic as well as steady kinetic models.
no code implementations • 5 Oct 2021 • Felix Dietrich, Juan M. Bello-Rivas, Ioannis G. Kevrekidis
We discuss the correspondence between Gaussian process regression and Geometric Harmonics, two similar kernel-based methods that are typically used in different contexts.
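The correspondence can be illustrated numerically: with the full kernel spectrum and vanishing observation noise, the Gaussian process posterior mean and the Geometric Harmonics (Nystrom) extension coincide. A minimal sketch, where the kernel bandwidth, data, and helper names are illustrative assumptions:

```python
import numpy as np

def gauss_kernel(A, B, eps):
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / eps)

def gp_mean(X, y, X_new, eps, sigma2):
    """Gaussian-process posterior mean (kernel ridge form)."""
    K = gauss_kernel(X, X, eps) + sigma2 * np.eye(len(X))
    return gauss_kernel(X_new, X, eps) @ np.linalg.solve(K, y)

def geometric_harmonics(X, y, X_new, eps):
    """Geometric Harmonics: expand y in the kernel eigenvectors and use
    the Nystrom formula for out-of-sample extension (full spectrum)."""
    K = gauss_kernel(X, X, eps)
    lam, V = np.linalg.eigh(K)
    return gauss_kernel(X_new, X, eps) @ V @ ((V.T @ y) / lam)

X = np.linspace(-1, 1, 6)[:, None]
y = np.sin(3 * X[:, 0])
X_new = np.array([[0.2], [0.5]])
# With the full spectrum and vanishing noise the two estimates coincide
print(gp_mean(X, y, X_new, eps=0.3, sigma2=1e-10))
print(geometric_harmonics(X, y, X_new, eps=0.3))
```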
no code implementations • 29 Jul 2021 • Yubin Lu, Romit Maulik, Ting Gao, Felix Dietrich, Ioannis G. Kevrekidis, Jinqiao Duan
Specifically, the learned map is a multivariate normalizing flow that deforms the support of the reference density to the support of each and every density snapshot in time.
2 code implementations • 10 Jun 2021 • Felix Dietrich, Alexei Makeev, George Kevrekidis, Nikolaos Evangelou, Tom Bertalan, Sebastian Reich, Ioannis G. Kevrekidis
We identify effective stochastic differential equations (SDEs) for coarse observables of fine-grained particle- or agent-based simulations; these SDEs then provide useful coarse surrogate models of the fine-scale dynamics.
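Once drift and diffusivity have been identified, the surrogate SDE can be simulated cheaply, e.g. with the Euler-Maruyama scheme. A sketch in which a hand-written Ornstein-Uhlenbeck drift and constant diffusivity stand in for the learned functions:

```python
import numpy as np

def euler_maruyama(drift, diffusivity, x0, dt, n_steps, rng):
    """Integrate dX = drift(X) dt + diffusivity(X) dW with Euler-Maruyama."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))   # Brownian increment
        x[k + 1] = x[k] + drift(x[k]) * dt + diffusivity(x[k]) * dw
    return x

# Toy surrogate: Ornstein-Uhlenbeck drift/diffusivity standing in for
# functions identified from particle-simulation data
rng = np.random.default_rng(2)
path = euler_maruyama(lambda x: -2.0 * x, lambda x: 0.5, 5.0, 1e-3, 5000, rng)
print(path[-1])  # the path has relaxed toward the zero mean
```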
2 code implementations • 4 May 2021 • Yue Guo, Felix Dietrich, Tom Bertalan, Danimir T. Doncevic, Manuel Dahmen, Ioannis G. Kevrekidis, Qianxiao Li
As a case study, we develop a machine learning approach that automatically learns effective solvers for initial value problems in the form of ordinary differential equations (ODEs), based on the Runge-Kutta (RK) integrator architecture.
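For reference, the fixed classical fourth-order Runge-Kutta step that such a learned-solver architecture generalizes; in the learned setting the tableau coefficients become trainable parameters, whereas here they are the standard constants:

```python
import math

def rk4_step(f, t, y, h):
    """One classical RK4 step for y' = f(t, y); the stage coefficients
    below are what a learned RK-style solver would treat as parameters."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Toy IVP: y' = -y, y(0) = 1, exact solution exp(-t)
y, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: -y, t, y, h)
    t += h
print(y, math.exp(-1.0))  # agree to about six decimal places
```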
no code implementations • 23 Dec 2020 • Felix P. Kemeth, Tom Bertalan, Thomas Thiem, Felix Dietrich, Sung Joon Moon, Carlo R. Laing, Ioannis G. Kevrekidis
These coordinates then serve as an emergent space in which to learn predictive models in the form of partial differential equations (PDEs) for the collective description of the coupled-agent system.
no code implementations • 10 Jul 2020 • Tom Bertalan, Felix Dietrich, Ioannis G. Kevrekidis
We propose to test, and when possible establish, an equivalence between two different artificial neural networks by attempting to construct a data-driven transformation between them, using manifold-learning techniques.
no code implementations • 15 Apr 2020 • Erez Peterfreund, Ofir Lindenbaum, Felix Dietrich, Tom Bertalan, Matan Gavish, Ioannis G. Kevrekidis, Ronald R. Coifman
We propose a deep-learning-based method for obtaining standardized data coordinates from scientific measurements. Data observations are modeled as samples from an unknown, non-linear deformation of an underlying Riemannian manifold, parametrized by a few normalized latent variables.
no code implementations • 9 Apr 2020 • Felix Dietrich, Or Yair, Rotem Mulayoff, Ronen Talmon, Ioannis G. Kevrekidis
We show analytically that our method is guaranteed to provide a set of orthogonal functions that are as jointly smooth as possible, ordered by increasing Dirichlet energy from the smoothest to the least smooth.
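Graph Laplacian eigenvectors are the canonical single-dataset instance of such an ordering: they are orthogonal, and for unit-norm eigenvectors the Dirichlet energy equals the eigenvalue, so sorting by eigenvalue sorts from smoothest to least smooth. A sketch on a path graph (the paper's jointly smooth, multi-observation construction is more involved than this illustration):

```python
import numpy as np

def path_graph_laplacian(n):
    """Combinatorial Laplacian L = D - W of a path graph on n nodes."""
    W = np.zeros((n, n))
    idx = np.arange(n - 1)
    W[idx, idx + 1] = W[idx + 1, idx] = 1.0
    return np.diag(W.sum(axis=1)) - W

def dirichlet_energy(L, f):
    """Dirichlet energy f^T L f: small for smooth f, large for wiggly f."""
    return f @ L @ f

n = 50
L = path_graph_laplacian(n)
eigvals, eigvecs = np.linalg.eigh(L)   # ascending eigenvalue order
energies = [dirichlet_energy(L, eigvecs[:, k]) for k in range(n)]
# For unit-norm eigenvectors the energy equals the eigenvalue, so the
# eigenvector ordering is exactly "smoothest first"
print(np.allclose(energies, eigvals))  # True
```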
no code implementations • 3 Jun 2019 • Or Yair, Felix Dietrich, Ronen Talmon, Ioannis G. Kevrekidis
We model the difference between two domains by a diffeomorphism and use the polar factorization theorem to claim that OT is indeed optimal for DA in a well-defined sense, up to a volume preserving map.
no code implementations • 16 Dec 2018 • Seungjoon Lee, Felix Dietrich, George E. Karniadakis, Ioannis G. Kevrekidis
In this paper, we explore mathematical algorithms for multifidelity information fusion that use such an approach to improve the representation of the high-fidelity function using only a few training data points.
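One common multifidelity pattern in this spirit is to correct a cheap low-fidelity model with a discrepancy learned from a handful of expensive high-fidelity samples. A hedged sketch using kernel ridge regression for the discrepancy; the functions, bandwidth, and data are toy stand-ins, not the paper's specific algorithms:

```python
import numpy as np

def fit_discrepancy(X_hi, y_hi, f_lo, eps=0.5, ridge=1e-8):
    """Learn the discrepancy between scarce high-fidelity data and a cheap
    low-fidelity model via kernel ridge regression; the corrected
    surrogate is f_lo(x) + delta(x)."""
    k = lambda A, B: np.exp(-(A[:, None] - B[None, :]) ** 2 / eps)
    K = k(X_hi, X_hi) + ridge * np.eye(len(X_hi))
    alpha = np.linalg.solve(K, y_hi - f_lo(X_hi))
    return lambda x: f_lo(x) + k(x, X_hi) @ alpha

f_lo = lambda x: np.sin(x)                 # cheap approximate model
f_hi = lambda x: np.sin(x) + 0.3 * x       # expensive "truth" (toy)
X_hi = np.array([0.0, 1.0, 2.0, 3.0])      # only four expensive evaluations
surrogate = fit_discrepancy(X_hi, f_hi(X_hi), f_lo)
x_test = np.array([1.5])
print(surrogate(x_test), f_hi(x_test))     # close despite the tiny budget
```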