Search Results for author: Benjamin Peherstorfer

Found 27 papers, 12 papers with code

Sequential-in-time training of nonlinear parametrizations for solving time-dependent partial differential equations

no code implementations • 1 Apr 2024 • Huan Zhang, Yifan Chen, Eric Vanden-Eijnden, Benjamin Peherstorfer

Sequential-in-time methods solve a sequence of training problems to fit nonlinear parametrizations such as neural networks to approximate solution trajectories of partial differential equations over time.

CoLoRA: Continuous low-rank adaptation for reduced implicit neural modeling of parameterized partial differential equations

no code implementations • 22 Feb 2024 • Jules Berman, Benjamin Peherstorfer

This work introduces reduced models based on Continuous Low Rank Adaptation (CoLoRA) that pre-train neural networks for a given partial differential equation and then continuously adapt low-rank weights in time to rapidly predict the evolution of solution fields at new physics parameters and new initial conditions.
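The adaptation mechanism can be illustrated with a toy layer: a frozen pre-trained weight matrix plus a time-dependent scalar scaling a fixed low-rank factor pair. This is a hedged sketch of the general low-rank-adaptation idea only; the layer shapes, names, and the single scalar `alpha` are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, rank = 8, 8, 2
W0 = rng.standard_normal((d_out, d_in))   # frozen pre-trained weights
A = rng.standard_normal((d_out, rank))    # fixed low-rank factors
B = rng.standard_normal((rank, d_in))

def colora_layer(x, alpha):
    """Apply weights adapted by a single time-dependent scalar alpha(t)."""
    return (W0 + alpha * (A @ B)) @ x

x = rng.standard_normal(d_in)
y0 = colora_layer(x, 0.0)   # alpha = 0 recovers the pre-trained layer
y1 = colora_layer(x, 0.5)   # the online phase only evolves alpha over time

assert np.allclose(y0, W0 @ x)
```

The appeal of the construction is that rapid prediction at new parameters only requires evolving the scalar weights, not the full network.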

Nonlinear embeddings for conserving Hamiltonians and other quantities with Neural Galerkin schemes

1 code implementation • 11 Oct 2023 • Paul Schwerdtner, Philipp Schulze, Jules Berman, Benjamin Peherstorfer

This work focuses on the conservation of quantities such as Hamiltonians, mass, and momentum when solution fields of partial differential equations are approximated with nonlinear parametrizations such as deep networks.

Randomized Sparse Neural Galerkin Schemes for Solving Evolution Equations with Deep Networks

2 code implementations • NeurIPS 2023 • Jules Berman, Benjamin Peherstorfer

Training neural networks sequentially in time to approximate solution fields of time-dependent partial differential equations can be beneficial for preserving causality and other physics properties; however, the sequential-in-time training is numerically challenging because training errors quickly accumulate and amplify over time.

Multifidelity Covariance Estimation via Regression on the Manifold of Symmetric Positive Definite Matrices

no code implementations • 23 Jul 2023 • Aimee Maurais, Terrence Alsup, Benjamin Peherstorfer, Youssef Marzouk

We introduce a multifidelity estimator of covariance matrices formulated as the solution to a regression problem on the manifold of symmetric positive definite matrices.

Metric Learning • Regression

Coupling parameter and particle dynamics for adaptive sampling in Neural Galerkin schemes

no code implementations • 27 Jun 2023 • Yuxiao Wen, Eric Vanden-Eijnden, Benjamin Peherstorfer

Training nonlinear parametrizations such as deep neural networks to numerically approximate solutions of partial differential equations is often based on minimizing a loss that includes the residual, which is analytically available in limited settings only.

Rank-Minimizing and Structured Model Inference

no code implementations • 19 Feb 2023 • Pawan Goyal, Benjamin Peherstorfer, Peter Benner

While extracting information from data with machine learning plays an increasingly important role, physical laws and other first principles continue to provide critical insights about systems and processes of interest in science and engineering.

Multi-Fidelity Covariance Estimation in the Log-Euclidean Geometry

1 code implementation • 31 Jan 2023 • Aimee Maurais, Terrence Alsup, Benjamin Peherstorfer, Youssef Marzouk

We introduce a multi-fidelity estimator of covariance matrices that employs the log-Euclidean geometry of the symmetric positive-definite manifold.
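A hedged sketch of the general idea (not the paper's estimator): combine a few expensive high-fidelity covariance samples with many cheap correlated low-fidelity ones through a control-variate-style correction applied in the log-Euclidean metric, so the combined estimate stays symmetric positive definite by construction. The models, sample sizes, and the unit correction weight below are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(1)
d = 3
C_true = np.diag([3.0, 2.0, 1.0])

# Coupled high-/low-fidelity samples (low fidelity = noisy copy, illustrative)
z_hi = rng.multivariate_normal(np.zeros(d), C_true, size=2000)
z_lo = z_hi + 0.3 * rng.standard_normal(z_hi.shape)

def sym_log(C):
    """Matrix logarithm of an SPD matrix, symmetrized for safety."""
    L = logm(C).real
    return 0.5 * (L + L.T)

n_hi = 50                          # only a few expensive high-fidelity samples
C_hi = np.cov(z_hi[:n_hi].T)
C_lo_few = np.cov(z_lo[:n_hi].T)   # same samples, cheap model
C_lo_many = np.cov(z_lo.T)         # many cheap low-fidelity samples

# Control-variate-style correction in the log-Euclidean geometry
# (weight 1.0 is an illustrative choice; the paper derives the weights)
L_mf = sym_log(C_hi) + 1.0 * (sym_log(C_lo_many) - sym_log(C_lo_few))
C_mf = expm(L_mf)
```

Working in the log domain is what guarantees positive definiteness: the matrix exponential of any symmetric matrix is SPD, which a plain sample-wise control variate on covariances does not ensure.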

Metric Learning

Further analysis of multilevel Stein variational gradient descent with an application to the Bayesian inference of glacier ice models

no code implementations • 6 Dec 2022 • Terrence Alsup, Tucker Hartland, Benjamin Peherstorfer, Noemi Petra

Multilevel Stein variational gradient descent is a method for particle-based variational inference that leverages hierarchies of surrogate target distributions with varying costs and fidelity to computationally speed up inference.

Bayesian Inference • Variational Inference

Context-aware learning of hierarchies of low-fidelity models for multi-fidelity uncertainty quantification

1 code implementation • 20 Nov 2022 • Ionut-Gabriel Farcas, Benjamin Peherstorfer, Tobias Neckel, Frank Jenko, Hans-Joachim Bungartz

When training low-fidelity models, the proposed approach takes into account the context in which the learned low-fidelity models will be used, namely for variance reduction in Monte Carlo estimation, which allows it to find optimal trade-offs between training and sampling to minimize upper bounds of the mean-squared errors of the estimators for given computational budgets.
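The variance-reduction context can be sketched with a plain multifidelity control-variate estimator; the two models and the sample split below are illustrative assumptions, not the paper's learned low-fidelity hierarchy or its budget optimization.

```python
import numpy as np

rng = np.random.default_rng(2)

def f_hi(x):            # expensive model (illustrative)
    return np.sin(x) + 0.1 * x**2

def f_lo(x):            # cheap correlated surrogate (illustrative)
    return np.sin(x)

n_hi, n_lo = 100, 10000
x_lo = rng.standard_normal(n_lo)
x_hi = x_lo[:n_hi]                     # shared samples couple the two models

y_hi = f_hi(x_hi)
y_lo_few, y_lo_many = f_lo(x_hi), f_lo(x_lo)

# Control-variate weight estimated from the shared samples
alpha = np.cov(y_hi, y_lo_few)[0, 1] / np.var(y_lo_few)

# Multifidelity estimate of E[f_hi]: few expensive + many cheap evaluations
est_mf = y_hi.mean() + alpha * (y_lo_many.mean() - y_lo_few.mean())
est_mc = y_hi.mean()                   # plain Monte Carlo baseline
```

The closer the low-fidelity model tracks the high-fidelity one, the more variance the correction removes for the same budget, which is exactly the trade-off the context-aware training targets.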

Uncertainty Quantification

Context-aware controller inference for stabilizing dynamical systems from scarce data

no code implementations • 22 Jul 2022 • Steffen W. R. Werner, Benjamin Peherstorfer

This work introduces a data-driven control approach for stabilizing high-dimensional dynamical systems from scarce data.

Neural Galerkin Schemes with Active Learning for High-Dimensional Evolution Equations

1 code implementation • 2 Mar 2022 • Joan Bruna, Benjamin Peherstorfer, Eric Vanden-Eijnden

Neural Galerkin schemes build on the Dirac-Frenkel variational principle to train networks by minimizing the residual sequentially over time, which enables adaptively collecting new training data in a self-informed manner that is guided by the dynamics described by the partial differential equations.
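For a linear-in-parameters ansatz, the residual-minimization step of a Neural Galerkin-style scheme reduces to a least-squares problem for the parameter velocity at each time step. The sketch below applies this to the 1D heat equation with a fixed Fourier ansatz; a deep-network ansatz would replace the basis, and all discretization choices here are illustrative.

```python
import numpy as np

# Ansatz u(x; theta) = sum_k theta_k sin(k*pi*x) on [0, 1]; PDE: u_t = u_xx
K = 4
x = np.linspace(0.01, 0.99, 200)                        # collocation points
Phi = np.sin(np.pi * np.outer(x, np.arange(1, K + 1)))  # d u / d theta_k
Phi_xx = -(np.pi * np.arange(1, K + 1))**2 * Phi        # d^2/dx^2 of basis

theta = np.array([1.0, 0.5, 0.0, 0.0])
dt, n_steps = 1e-4, 1000

for _ in range(n_steps):
    rhs = Phi_xx @ theta                     # PDE right-hand side at x
    # Dirac-Frenkel: choose theta_dot minimizing || Phi theta_dot - rhs ||
    theta_dot, *_ = np.linalg.lstsq(Phi, rhs, rcond=None)
    theta = theta + dt * theta_dot           # forward Euler in time

# Modes should decay like exp(-(k*pi)^2 t), matching the exact heat equation
t = dt * n_steps
expected = np.array([1.0, 0.5, 0.0, 0.0]) * np.exp(
    -(np.pi * np.arange(1, K + 1))**2 * t)
```

The self-informed data collection mentioned in the abstract corresponds to adapting where the collocation points `x` are placed as the solution evolves, which this fixed-grid sketch omits.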

Active Learning • Vocal Bursts Intensity Prediction

On the sample complexity of stabilizing linear dynamical systems from data

no code implementations • 28 Feb 2022 • Steffen W. R. Werner, Benjamin Peherstorfer

Learning controllers from data for stabilizing dynamical systems typically follows a two step process of first identifying a model and then constructing a controller based on the identified model.
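The two-step process can be sketched end to end: fit (A, B) by least squares from a short input-state trajectory, then build a stabilizing LQR controller for the identified model. The system, horizon, and cost weights below are illustrative assumptions; the paper's contribution concerns how little data suffices, which this sketch does not analyze.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

rng = np.random.default_rng(6)

# Unknown unstable system x_{t+1} = A x_t + B u_t (illustrative)
A = np.array([[1.1, 0.5], [0.0, 1.2]])
B = np.array([[0.0], [1.0]])

# Step 1: identify (A, B) from a short trajectory with random inputs
T = 20
X = np.zeros((2, T + 1))
U = rng.standard_normal((1, T))
for t in range(T):
    X[:, t + 1] = A @ X[:, t] + B @ U[:, t]
Z = np.vstack([X[:, :T], U])                      # regressors [x_t; u_t]
AB = np.linalg.lstsq(Z.T, X[:, 1:].T, rcond=None)[0].T
A_id, B_id = AB[:, :2], AB[:, 2:]

# Step 2: LQR controller for the identified model
P = solve_discrete_are(A_id, B_id, np.eye(2), np.eye(1))
K = np.linalg.solve(np.eye(1) + B_id.T @ P @ B_id, B_id.T @ P @ A_id)

# Closed loop on the true system is stable: spectral radius below one
rho = np.max(np.abs(np.linalg.eigvals(A - B @ K)))
```

With noisy data the identification step is no longer exact, and quantifying how many samples are needed before the resulting controller still stabilizes the true system is precisely the sample-complexity question.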

An Extensible Benchmark Suite for Learning to Simulate Physical Systems

1 code implementation • 9 Aug 2021 • Karl Otness, Arvi Gjoka, Joan Bruna, Daniele Panozzo, Benjamin Peherstorfer, Teseo Schneider, Denis Zorin

Simulating physical systems is a core component of scientific computing, encompassing a wide range of physical domains and applications.

Computational Efficiency

Active operator inference for learning low-dimensional dynamical-system models from noisy data

no code implementations • 20 Jul 2021 • Wayne Isaac Tan Uy, Yuepeng Wang, Yuxiao Wen, Benjamin Peherstorfer

Furthermore, the connection between operator inference and projection-based model reduction enables bounding the mean-squared errors of predictions made with the learned models with respect to traditional reduced models.

Physics-informed regularization and structure preservation for learning stable reduced models from data with operator inference

no code implementations • 6 Jul 2021 • Nihar Sawant, Boris Kramer, Benjamin Peherstorfer

Operator inference learns low-dimensional dynamical-system models with polynomial nonlinear terms from trajectories of high-dimensional physical systems (non-intrusive model reduction).
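The core regression in operator inference can be sketched directly: stack snapshots and their quadratic monomials into a data matrix and fit all operators in one least-squares problem. The toy system below is illustrative and noiseless, not one of the paper's benchmark problems, and it omits the paper's regularization and structure-preservation ingredients.

```python
import numpy as np

rng = np.random.default_rng(3)

# "Truth": low-dimensional dynamics with linear + quadratic terms,
# x_dot = A x + H q(x), where q(x) collects the unique quadratic monomials
A_true = np.array([[-0.5, 1.0], [-1.0, -0.5]])
H_true = np.array([[0.1, 0.0, 0.0], [0.0, 0.0, -0.2]])

def quad(X):
    return np.vstack([X[0] * X[0], X[0] * X[1], X[1] * X[1]])

X = rng.standard_normal((2, 500))          # sampled states (snapshots)
Xdot = A_true @ X + H_true @ quad(X)       # matching time derivatives

# Operator inference: one least-squares problem for all operators,
# min_{A, H} || [A H] D - Xdot ||_F with data matrix D = [X; q(X)]
D = np.vstack([X, quad(X)])
O = np.linalg.lstsq(D.T, Xdot.T, rcond=None)[0].T
A_fit, H_fit = O[:, :2], O[:, 2:]
```

With noiseless data in the model class the fit recovers the operators exactly; the paper's regularization matters precisely when the data are noisy or the fitted model would otherwise be unstable.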

Multilevel Stein variational gradient descent with applications to Bayesian inverse problems

no code implementations • 5 Apr 2021 • Terrence Alsup, Luca Venturi, Benjamin Peherstorfer

The proposed multilevel Stein variational gradient descent moves most of the iterations to lower, cheaper levels with the aim of requiring only a few iterations on the higher, more expensive levels when compared to the traditional, single-level Stein variational gradient descent variant that uses the highest-level distribution only.
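The single-level building block, one Stein variational gradient descent update, can be sketched as follows; the multilevel method applies such updates against a hierarchy of surrogate targets of increasing cost. The 1D Gaussian target, RBF kernel bandwidth, and step size below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def grad_log_p(x):
    """Score of a standard Gaussian target (illustrative)."""
    return -x

def svgd_step(x, eps=0.1, h=0.5):
    """One SVGD update with an RBF kernel: attraction + repulsion."""
    diff = x[:, None] - x[None, :]            # diff[i, j] = x_i - x_j
    K = np.exp(-diff**2 / (2 * h))            # kernel matrix k(x_j, x_i)
    repulsion = (diff / h * K).sum(axis=1)    # sum_j grad_{x_j} k(x_j, x_i)
    return x + eps * (K @ grad_log_p(x) + repulsion) / len(x)

x = rng.normal(3.0, 0.5, size=50)             # particles start off-target
for _ in range(500):
    x = svgd_step(x)                          # particles drift toward N(0, 1)
```

The multilevel idea is to run most of these deterministic particle updates with a cheap surrogate `grad_log_p` and only a few with the expensive target, so the cost per effective iteration drops.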

Operator inference of non-Markovian terms for learning reduced models from partially observed state trajectories

1 code implementation • 1 Mar 2021 • Wayne Isaac Tan Uy, Benjamin Peherstorfer

The core contributions of this work are a data sampling scheme to sample partially observed states from high-dimensional dynamical systems and a formulation of a regression problem to fit the non-Markovian reduced terms to the sampled states.

Context-aware surrogate modeling for balancing approximation and sampling costs in multi-fidelity importance sampling and Bayesian inverse problems

no code implementations • 22 Oct 2020 • Terrence Alsup, Benjamin Peherstorfer

Thus, there is a trade-off between investing computational resources to improve the accuracy of surrogate models versus simply making more frequent recourse to expensive high-fidelity models; however, this trade-off is ignored by traditional modeling methods that construct surrogate models that are meant to replace high-fidelity models rather than being used together with high-fidelity models.

Depth separation for reduced deep networks in nonlinear model reduction: Distilling shock waves in nonlinear hyperbolic problems

no code implementations • 28 Jul 2020 • Donsub Rim, Luca Venturi, Joan Bruna, Benjamin Peherstorfer

Classical reduced models are low-rank approximations using a fixed basis designed to achieve dimensionality reduction of large-scale systems.

Dimensionality Reduction

Probabilistic error estimation for non-intrusive reduced models learned from data of systems governed by linear parabolic partial differential equations

1 code implementation • 12 May 2020 • Wayne Isaac Tan Uy, Benjamin Peherstorfer

This work derives a residual-based a posteriori error estimator for reduced models learned with non-intrusive model reduction from data of high-dimensional systems governed by linear parabolic partial differential equations with control inputs.

Operator inference for non-intrusive model reduction of systems with non-polynomial nonlinear terms

1 code implementation • 22 Feb 2020 • Peter Benner, Pawan Goyal, Boris Kramer, Benjamin Peherstorfer, Karen Willcox

The proposed method learns operators for the linear and polynomially nonlinear dynamics via a least-squares problem, where the given non-polynomial terms are incorporated in the right-hand side.

Learning low-dimensional dynamical-system models from noisy frequency-response data with Loewner rational interpolation

no code implementations • 30 Sep 2019 • Zlatko Drmač, Benjamin Peherstorfer

Loewner rational interpolation provides a versatile tool to learn low-dimensional dynamical-system models from frequency-response measurements.
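A minimal sketch of the Loewner idea in the noise-free case: build the Loewner and shifted Loewner matrices from left/right frequency-response samples and read off a rational model directly. The degree-2 transfer function and the interpolation points below are illustrative assumptions.

```python
import numpy as np

def H(s):
    """'Measured' transfer function (illustrative, McMillan degree 2)."""
    return 1.0 / (s + 1.0) + 2.0 / (s + 3.0)

mu = np.array([0.5, 1.5])       # left interpolation points
lam = np.array([1.0, 2.0])      # right interpolation points
v, w = H(mu), H(lam)            # sampled frequency-response values

# Loewner and shifted Loewner matrices built from the data alone
L = (v[:, None] - w[None, :]) / (mu[:, None] - lam[None, :])
Ls = (mu[:, None] * v[:, None] - lam[None, :] * w[None, :]) / (
    mu[:, None] - lam[None, :])

def H_red(s):
    """Learned rational model: H_red(s) = w^T (Ls - s L)^{-1} v."""
    return w @ np.linalg.solve(Ls - s * L, v)
```

With exact data from a rational function and enough interpolation points, the Loewner model matches the underlying transfer function; the paper analyzes what happens when the samples are noisy.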

Sampling low-dimensional Markovian dynamics for pre-asymptotically recovering reduced models from data with operator inference

2 code implementations • 29 Aug 2019 • Benjamin Peherstorfer

Thus, the learned models are guaranteed to inherit the well-studied properties of reduced models from traditional model reduction.

Stabilizing discrete empirical interpolation via randomized and deterministic oversampling

1 code implementation • 30 Aug 2018 • Benjamin Peherstorfer, Zlatko Drmač, Serkan Gugercin

Numerical experiments with synthetic and diffusion-reaction problems demonstrate the stability of oversampled empirical interpolation in the presence of noise.
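Oversampled empirical interpolation can be sketched as gappy least-squares reconstruction: select more sample points than basis vectors and solve a least-squares problem instead of an interpolation problem, which damps noise in the sampled entries. Column-pivoted QR is used below as a stand-in point-selection heuristic, and all sizes are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(5)

n, m, p = 200, 5, 15                     # p > m is the oversampling
x = np.linspace(0, 1, n)
# Orthonormal basis of smooth modes (stand-in for a POD/DEIM basis)
U = np.linalg.qr(np.cos(np.pi * np.outer(x, np.arange(1, m + 1))))[0]

# Greedy point selection via column-pivoted QR on U^T; the first m pivots
# mimic interpolation-style selection, indices m..p oversample
_, _, piv = qr(U.T, pivoting=True)
idx = piv[:p]

f = U @ rng.standard_normal(m)           # a vector in the span of the basis
f_noisy = f + 1e-2 * rng.standard_normal(n)   # noisy point evaluations

# Gappy least-squares reconstruction from the p sampled (noisy) entries
c = np.linalg.lstsq(U[idx], f_noisy[idx], rcond=None)[0]
f_rec = U @ c
```

With exactly p = m points the reconstruction interpolates the noisy samples and can amplify the noise; taking p > m turns the square system into an overdetermined one, which is the stabilization the abstract refers to.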

Numerical Analysis
