Search Results for author: Felix Dietrich

Found 20 papers, 6 papers with code

Multi-fidelity Gaussian process surrogate modeling for regression problems in physics

no code implementations18 Apr 2024 Kislaya Ravi, Vladyslav Fediukov, Felix Dietrich, Tobias Neckel, Fabian Buse, Michael Bergmann, Hans-Joachim Bungartz

One of the main challenges in surrogate modeling is the limited availability of data due to resource constraints associated with computationally expensive simulations.

regression
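
A minimal sketch of the two-fidelity idea behind such surrogates, in the Kennedy-O'Hagan autoregressive style: a GP fitted on plentiful cheap data is corrected by a second GP trained on the few expensive runs. The toy functions, the kernel choices, and the fixed scaling factor rho are illustrative assumptions, not the models used in the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f_low(x):            # cheap, biased simulation
    return np.sin(8 * x).ravel()

def f_high(x):           # expensive, accurate simulation
    return (np.sin(8 * x) + 0.3 * x).ravel()

x_low = np.linspace(0, 1, 40)[:, None]       # plenty of cheap data
x_high = np.linspace(0, 1, 6)[:, None]       # only a few expensive runs

gp_low = GaussianProcessRegressor(kernel=RBF(0.2)).fit(x_low, f_low(x_low))

# Model the expensive data as rho * (low-fidelity prediction) + a GP correction.
rho = 1.0
gp_delta = GaussianProcessRegressor(kernel=RBF(0.2)).fit(
    x_high, f_high(x_high) - rho * gp_low.predict(x_high))

x_test = np.linspace(0, 1, 200)[:, None]
y_mf = rho * gp_low.predict(x_test) + gp_delta.predict(x_test)
print("max surrogate error:", np.abs(y_mf - f_high(x_test)).max())
```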

Systematic construction of continuous-time neural networks for linear dynamical systems

1 code implementation24 Mar 2024 Chinmay Datar, Adwait Datar, Felix Dietrich, Wil Schilders

Discovering a suitable neural network architecture for modeling complex dynamical systems poses a formidable challenge, often involving extensive trial and error and navigation through a high-dimensional hyper-parameter space.
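
A minimal sketch of one systematic route from snapshot data to a continuous-time linear model dx/dt = Ax: estimate the discrete-time operator by least squares and take its matrix logarithm. The sampled oscillator and the log/exp construction are illustrative assumptions, not the network construction proposed in the paper.

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(0)
A_true = np.array([[0.0, 1.0], [-1.0, -0.1]])   # damped oscillator
dt = 0.05
K_true = expm(A_true * dt)

# Generate snapshot pairs (x_k, x_{k+1}) from a few trajectories.
X, Y = [], []
for _ in range(5):
    x = rng.normal(size=2)
    for _ in range(200):
        x_next = K_true @ x
        X.append(x); Y.append(x_next)
        x = x_next
X, Y = np.array(X).T, np.array(Y).T

# Least-squares estimate of the discrete-time operator; its matrix logarithm
# gives a continuous-time generator that can be evaluated at any time t.
K_hat = Y @ np.linalg.pinv(X)
A_hat = logm(K_hat).real / dt

t = 1.7                                    # arbitrary time, not a multiple of dt
x0 = np.array([1.0, 0.0])
print(expm(A_hat * t) @ x0, "vs", expm(A_true * t) @ x0)
```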

Gappy local conformal auto-encoders for heterogeneous data fusion: in praise of rigidity

no code implementations20 Dec 2023 Erez Peterfreund, Iryna Burak, Ofir Lindenbaum, Jim Gimlett, Felix Dietrich, Ronald R. Coifman, Ioannis G. Kevrekidis

Fusing measurements from multiple, heterogeneous, partial sources that observe a common object or process poses challenges that grow with the increasing number and variety of available sensors.

Local Distortion

Data-driven modelling of brain activity using neural networks, Diffusion Maps, and the Koopman operator

no code implementations24 Apr 2023 Ioannis K. Gallos, Daniel Lehmberg, Felix Dietrich, Constantinos Siettos

Importantly, we show that the proposed Koopman operator approach provides, for all practical purposes, results equivalent to the FNN-GH approach, thus bypassing the need to train a non-linear map and to use GH to extrapolate predictions in the ambient fMRI space; instead, one can use the low-frequency truncation of the DMs' function space of L^2-integrable functions to predict the entire set of coordinate functions in the fMRI space and to solve the pre-image problem.

Time Series
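
A minimal DMD-style sketch of the linear (Koopman) prediction step on low-dimensional embedding coordinates; the synthetic signal below stands in for the Diffusion Maps coordinates of the paper, and none of the fMRI pipeline is reproduced.

```python
import numpy as np

t = np.linspace(0, 20, 400)
coords = np.vstack([np.cos(t), np.sin(t), np.cos(2 * t)])  # toy "embedding" time series

X, Y = coords[:, :-1], coords[:, 1:]
K = Y @ np.linalg.pinv(X)            # best linear one-step map in the embedding space

# Roll the linear model forward from the last observed state.
x = coords[:, -1].copy()
preds = []
for _ in range(50):
    x = K @ x
    preds.append(x.copy())
print(np.array(preds)[:3])
```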

A Recursively Recurrent Neural Network (R2N2) Architecture for Learning Iterative Algorithms

no code implementations22 Nov 2022 Danimir T. Doncevic, Alexander Mitsos, Yue Guo, Qianxiao Li, Felix Dietrich, Manuel Dahmen, Ioannis G. Kevrekidis

Meta-learning of numerical algorithms for a given task consists of the data-driven identification and adaptation of an algorithmic structure and the associated hyperparameters.

Inductive Bias, Meta-Learning

Safe Policy Improvement Approaches and their Limitations

1 code implementation1 Aug 2022 Philipp Scholl, Felix Dietrich, Clemens Otte, Steffen Udluft

Based on this finding, we develop adaptations, the Adv-Soft-SPIBB algorithms, and show that they are provably safe.

Learning Effective SDEs from Brownian Dynamics Simulations of Colloidal Particles

1 code implementation30 Apr 2022 Nikolaos Evangelou, Felix Dietrich, Juan M. Bello-Rivas, Alex Yeh, Rachel Stein, Michael A. Bevan, Ioannis G. Kevrekidis

We construct a reduced, data-driven, parameter dependent effective Stochastic Differential Equation (eSDE) for electric-field mediated colloidal crystallization using data obtained from Brownian Dynamics Simulations.

Dimensionality Reduction
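
A minimal sketch of extracting an effective SDE from trajectory data via binned Kramers-Moyal averages, with an Ornstein-Uhlenbeck process standing in for the colloidal observables; the toy process and the binning are assumptions, not the paper's pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps = 1e-3, 100_000
theta, sigma = 2.0, 0.5                      # true drift slope and noise level

x = np.empty(n_steps)
x[0] = 0.0
for k in range(n_steps - 1):                 # Euler-Maruyama simulation
    x[k + 1] = x[k] - theta * x[k] * dt + sigma * np.sqrt(dt) * rng.normal()

dx = np.diff(x)
bins = np.linspace(x.min(), x.max(), 30)
idx = np.digitize(x[:-1], bins)

for b in range(12, 17):                      # a few bins near the center
    sel = idx == b
    center = 0.5 * (bins[b - 1] + bins[b])
    drift = dx[sel].mean() / dt              # should be close to -theta * x
    diff2 = (dx[sel] ** 2).mean() / dt       # should be close to sigma**2
    print(f"x~{center:+.2f}  drift~{drift:+.2f}  sigma^2~{diff2:.3f}")
```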

Safe Policy Improvement Approaches on Discrete Markov Decision Processes

1 code implementation28 Jan 2022 Philipp Scholl, Felix Dietrich, Clemens Otte, Steffen Udluft

Safe Policy Improvement (SPI) aims at provable guarantees that a learned policy is at least approximately as good as a given baseline policy.
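
A minimal sketch of the quantity SPI reasons about: exact policy evaluation on a known MDP, followed by the check that the learned policy's value does not fall more than epsilon below the baseline's. The toy MDP, the policies, and epsilon are illustrative assumptions; this is not the SPIBB machinery itself.

```python
import numpy as np

n_states, n_actions, gamma = 3, 2, 0.95
# P[s, a, s'] transition probabilities and R[s, a] rewards of a small known MDP.
P = np.array([
    [[0.9, 0.1, 0.0], [0.1, 0.9, 0.0]],
    [[0.0, 0.8, 0.2], [0.0, 0.2, 0.8]],
    [[0.5, 0.0, 0.5], [0.0, 0.5, 0.5]],
])
R = np.array([[0.0, 0.1], [0.2, 0.5], [1.0, 0.3]])

def policy_value(pi):
    """Exact policy evaluation: solve (I - gamma * P_pi) V = R_pi."""
    P_pi = np.einsum("sa,sap->sp", pi, P)
    R_pi = np.einsum("sa,sa->s", pi, R)
    return np.linalg.solve(np.eye(n_states) - gamma * P_pi, R_pi)

baseline = np.full((n_states, n_actions), 0.5)       # uniform baseline policy
learned = np.array([[0.1, 0.9], [0.1, 0.9], [0.9, 0.1]])

v_b, v_l = policy_value(baseline), policy_value(learned)
epsilon = 0.05
print("safe improvement:", bool(np.all(v_l >= v_b - epsilon)))
```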

On the Parameter Combinations That Matter and on Those That do Not

no code implementations13 Oct 2021 Nikolaos Evangelou, Noah J. Wichrowski, George A. Kevrekidis, Felix Dietrich, Mahdi Kooshkbaghi, Sarah McFann, Ioannis G. Kevrekidis

We present a data-driven approach to characterizing nonidentifiability of a model's parameters and illustrate it through dynamic as well as steady kinetic models.
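
A minimal, linearized stand-in for this idea: finite-difference output sensitivities followed by an SVD, where near-zero singular values point along parameter combinations that do not matter. The toy model is an assumption, and the paper's approach is data-driven rather than Jacobian-based.

```python
import numpy as np

def model(p):
    # Toy model that depends only on the product p[0] * p[1] and on p[2]; changing
    # p[0] and p[1] while keeping their product fixed does not change the output.
    t = np.linspace(0, 1, 20)
    return np.exp(-p[0] * p[1] * t) + p[2] * t

p0 = np.array([1.0, 2.0, 0.5])
eps = 1e-6
J = np.column_stack([
    (model(p0 + eps * np.eye(3)[i]) - model(p0 - eps * np.eye(3)[i])) / (2 * eps)
    for i in range(3)
])

_, s, Vt = np.linalg.svd(J, full_matrices=False)
print("singular values:", np.round(s, 4))
print("direction that does not matter:", np.round(Vt[-1], 3))
```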

On the Correspondence between Gaussian Processes and Geometric Harmonics

no code implementations5 Oct 2021 Felix Dietrich, Juan M. Bello-Rivas, Ioannis G. Kevrekidis

We discuss the correspondence between Gaussian process regression and Geometric Harmonics, two similar kernel-based methods that are typically used in different contexts.

Bayesian Optimization, Dimensionality Reduction, +2
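
A minimal numerical sketch of the correspondence: with the same kernel, no spectral truncation, and vanishing noise, the GP posterior mean and the Geometric Harmonics (Nystrom) extension coincide; truncating the spectrum and adding the noise term are where the two methods depart. The 1D toy data and the kernel scale are assumptions.

```python
import numpy as np

X = np.linspace(-3, 3, 12)
y = np.sin(X)
X_new = np.linspace(-3, 3, 7)

def kernel(a, b, ell=0.4):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

K = kernel(X, X)
k_new = kernel(X_new, X)

# Gaussian-process posterior mean (noise-free limit).
gp_mean = k_new @ np.linalg.solve(K + 1e-10 * np.eye(len(X)), y)

# Geometric Harmonics: project y on the kernel eigenvectors, extend via Nystrom.
lam, psi = np.linalg.eigh(K)
coeffs = psi.T @ y
gh_mean = k_new @ psi @ (coeffs / lam)

print(np.max(np.abs(gp_mean - gh_mean)))   # ~0 when no truncation is applied
```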

Learning the temporal evolution of multivariate densities via normalizing flows

no code implementations29 Jul 2021 Yubin Lu, Romit Maulik, Ting Gao, Felix Dietrich, Ioannis G. Kevrekidis, Jinqiao Duan

Specifically, the learned map is a multivariate normalizing flow that deforms the support of the reference density to the support of each and every density snapshot in time.
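
A minimal sketch of the flow idea in its simplest form: for each snapshot, fit an invertible map pushing a standard-normal reference onto the data and evaluate densities with the change-of-variables formula. A closed-form affine map stands in for the learned normalizing flow of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
for t in (0.0, 0.5, 1.0):                        # three density snapshots in time
    data = rng.normal(loc=2.0 * t, scale=1.0 + t, size=5000)
    mu, s = data.mean(), data.std()              # MLE of the affine map T(z) = mu + s*z
    x = np.array([0.0, 2.0])                     # evaluate the learned density here
    z = (x - mu) / s                             # pull back to the reference density
    log_p = -0.5 * z**2 - 0.5 * np.log(2 * np.pi) - np.log(s)
    print(f"t={t:.1f}  log-density at x=0 and x=2:", np.round(log_p, 3))
```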

Learning effective stochastic differential equations from microscopic simulations: linking stochastic numerics to deep learning

2 code implementations10 Jun 2021 Felix Dietrich, Alexei Makeev, George Kevrekidis, Nikolaos Evangelou, Tom Bertalan, Sebastian Reich, Ioannis G. Kevrekidis

We identify effective stochastic differential equations (SDE) for coarse observables of fine-grained particle- or agent-based simulations; these SDE then provide useful coarse surrogate models of the fine scale dynamics.

Numerical Integration
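
A minimal sketch of the link between stochastic numerics and learning: an Euler-Maruyama step implies a Gaussian transition density whose negative log-likelihood can be minimized over drift and diffusivity parameters. A grid search over a linear drift slope and a constant noise level stands in for the neural networks of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
dt, n = 1e-2, 50_000
theta_true, sigma_true = 1.5, 0.4
x = np.empty(n); x[0] = 1.0
for k in range(n - 1):                           # simulate an Ornstein-Uhlenbeck path
    x[k + 1] = x[k] - theta_true * x[k] * dt + sigma_true * np.sqrt(dt) * rng.normal()

x0, x1 = x[:-1], x[1:]

def neg_log_lik(theta, sigma):
    mean = x0 - theta * x0 * dt                  # Euler-Maruyama predictive mean
    var = sigma ** 2 * dt                        # Euler-Maruyama predictive variance
    return np.sum(0.5 * (x1 - mean) ** 2 / var + 0.5 * np.log(2 * np.pi * var))

thetas, sigmas = np.linspace(0.5, 2.5, 41), np.linspace(0.1, 0.8, 36)
losses = np.array([[neg_log_lik(th, sg) for sg in sigmas] for th in thetas])
i, j = np.unravel_index(losses.argmin(), losses.shape)
print("estimated theta, sigma:", thetas[i], sigmas[j])   # close to 1.5, 0.4
```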

Personalized Algorithm Generation: A Case Study in Learning ODE Integrators

2 code implementations4 May 2021 Yue Guo, Felix Dietrich, Tom Bertalan, Danimir T. Doncevic, Manuel Dahmen, Ioannis G. Kevrekidis, Qianxiao Li

As a case study, we develop a machine learning approach that automatically learns effective solvers for initial value problems in the form of ordinary differential equations (ODEs), based on the Runge-Kutta (RK) integrator architecture.

Meta-Learning
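
A minimal sketch of the object being personalized: an explicit Runge-Kutta step whose Butcher tableau is a plain parameter array that a learner could adjust. Classic RK4 coefficients are plugged in below; the learning loop itself is omitted.

```python
import numpy as np

def rk_step(f, t, y, h, A, b, c):
    """One explicit Runge-Kutta step defined entirely by the tableau (A, b, c)."""
    s = len(b)
    k = np.zeros((s,) + np.shape(y))
    for i in range(s):
        k[i] = f(t + c[i] * h, y + h * sum(A[i][j] * k[j] for j in range(i)))
    return y + h * sum(b[i] * k[i] for i in range(s))

# Classic RK4 tableau; a meta-learner would instead treat these as trainable.
A = [[0, 0, 0, 0], [0.5, 0, 0, 0], [0, 0.5, 0, 0], [0, 0, 1, 0]]
b = [1 / 6, 1 / 3, 1 / 3, 1 / 6]
c = [0, 0.5, 0.5, 1]

f = lambda t, y: -y                       # dy/dt = -y, exact solution exp(-t)
y, h = 1.0, 0.1
for step in range(10):
    y = rk_step(f, step * h, y, h, A, b, c)
print(y, "vs exact", np.exp(-1.0))
```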

Learning emergent PDEs in a learned emergent space

no code implementations23 Dec 2020 Felix P. Kemeth, Tom Bertalan, Thomas Thiem, Felix Dietrich, Sung Joon Moon, Carlo R. Laing, Ioannis G. Kevrekidis

These coordinates then serve as an emergent space in which to learn predictive models in the form of partial differential equations (PDEs) for the collective description of the coupled-agent system.

Transformations between deep neural networks

no code implementations10 Jul 2020 Tom Bertalan, Felix Dietrich, Ioannis G. Kevrekidis

We propose to test, and when possible establish, an equivalence between two different artificial neural networks by attempting to construct a data-driven transformation between them, using manifold-learning techniques.

Transfer Learning
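
A minimal sketch of the underlying question: given the activations of two networks on shared inputs, construct a data-driven transformation from one representation to the other. A linear least-squares map between toy feature maps stands in for the manifold-learning construction of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 3))                       # shared inputs

W1 = rng.normal(size=(3, 10))
h1 = np.tanh(X @ W1)                                # activations of "network 1"

Q = np.linalg.qr(rng.normal(size=(10, 10)))[0]      # hidden rotation
h2 = h1 @ Q                                         # "network 2" sees a rotated copy

# Data-driven transformation from representation 1 to representation 2.
T, *_ = np.linalg.lstsq(h1, h2, rcond=None)
print("recovered the rotation:", np.allclose(T, Q, atol=1e-6))
```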

LOCA: LOcal Conformal Autoencoder for standardized data coordinates

no code implementations15 Apr 2020 Erez Peterfreund, Ofir Lindenbaum, Felix Dietrich, Tom Bertalan, Matan Gavish, Ioannis G. Kevrekidis, Ronald R. Coifman

We propose a deep-learning-based method for obtaining standardized data coordinates from scientific measurements. Data observations are modeled as samples from an unknown, non-linear deformation of an underlying Riemannian manifold, which is parametrized by a few normalized latent variables.

Spectral Discovery of Jointly Smooth Features for Multimodal Data

no code implementations9 Apr 2020 Felix Dietrich, Or Yair, Rotem Mulayoff, Ronen Talmon, Ioannis G. Kevrekidis

We show analytically that our method is guaranteed to provide a set of orthogonal functions that are as jointly smooth as possible, ordered by increasing Dirichlet energy from the smoothest to the least smooth.
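
A minimal sketch of one way to obtain jointly smooth functions for two sensors observing a shared variable: stack the leading kernel eigenvectors of each modality and read off the common directions from an SVD, where singular values close to sqrt(2) flag functions lying in both eigenspaces. The kernel scale and truncation level below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 300
angle = np.linspace(0, 2 * np.pi, n)                       # shared variable
s1 = np.column_stack([np.cos(angle), np.sin(angle), rng.normal(size=n)])
s2 = np.column_stack([np.cos(angle + 1.0), np.sin(angle + 1.0), rng.normal(size=n)])

def kernel_eigvecs(data, n_vecs=15, eps=2.0):
    d2 = ((data[:, None, :] - data[None, :, :]) ** 2).sum(-1)
    _, vecs = np.linalg.eigh(np.exp(-d2 / eps))
    return vecs[:, -n_vecs:]                               # leading eigenvectors

stacked = np.hstack([kernel_eigvecs(s1), kernel_eigvecs(s2)])
_, svals, _ = np.linalg.svd(stacked, full_matrices=False)

# Singular values close to sqrt(2) ~ 1.414 indicate directions shared by both
# sensors' eigenspaces, i.e. jointly smooth functions; the rest fall off toward 1.
print(np.round(svals[:8], 2))
```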

Domain Adaptation with Optimal Transport on the Manifold of SPD matrices

no code implementations3 Jun 2019 Or Yair, Felix Dietrich, Ronen Talmon, Ioannis G. Kevrekidis

We model the difference between two domains by a diffeomorphism and use the polar factorization theorem to claim that OT is indeed optimal for DA in a well-defined sense, up to a volume preserving map.

Brain Computer Interface, Domain Adaptation
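
A heavily simplified, one-dimensional Euclidean stand-in for the transport step: in 1D the optimal map is monotone quantile matching, the scalar analogue of the gradient-of-a-convex-function guaranteed by polar factorization. The SPD-manifold geometry the paper actually works on is not represented here.

```python
import numpy as np

rng = np.random.default_rng(7)
source = rng.normal(loc=0.0, scale=1.0, size=2000)     # feature from domain A
target = rng.normal(loc=3.0, scale=2.0, size=2000)     # same feature, domain B

# Monotone (optimal) transport map estimated from empirical quantiles.
q = np.linspace(0, 1, 101)
src_q, tgt_q = np.quantile(source, q), np.quantile(target, q)
adapt = lambda x: np.interp(x, src_q, tgt_q)

moved = adapt(source)
print("adapted mean/std:", round(moved.mean(), 2), round(moved.std(), 2))  # ~3.0, ~2.0
```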

Linking Gaussian Process regression with data-driven manifold embeddings for nonlinear data fusion

no code implementations16 Dec 2018 Seungjoon Lee, Felix Dietrich, George E. Karniadakis, Ioannis G. Kevrekidis

In this paper, we will explore mathematical algorithms for multifidelity information fusion that use such an approach towards improving the representation of the high-fidelity function with only a few training data points.

Gaussian Processes, regression
