The data-based discovery of effective, coarse-grained (CG) models of high-dimensional dynamical systems presents a unique challenge in computational physics, particularly in the context of multiscale problems.
Given (small amounts of) time-series data from a high-dimensional, fine-grained, multiscale dynamical system, we propose a generative framework for learning an effective, lower-dimensional, coarse-grained dynamical model that is predictive not only of the fine-grained system's long-term evolution but also of its behavior under different initial conditions.
We advocate a probabilistic (Bayesian) model in which equalities available from the physics (e.g., residuals, conservation laws) can be introduced as virtual observables and provide additional information through the likelihood.
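To make the virtual-observables idea concrete, here is a minimal sketch: a physics equality r(theta) = 0 is treated as if it were observed with a small tolerance, contributing an extra Gaussian factor to the likelihood. The specific residual, the data point, and all numerical values are illustrative stand-ins, not the paper's actual model.

```python
import numpy as np

# A scalar unknown `theta` with prior N(0, 1), one noisy data point `y`,
# and a hypothetical physics equality r(theta) = 0. The equality is a
# "virtual observable": we pretend to observe r(theta) = 0 with small
# tolerance `eps`, adding the factor N(0 | r(theta), eps^2) to the likelihood.
def log_posterior(theta, y, sigma=0.5, eps=1e-2):
    log_prior = -0.5 * theta**2
    log_lik_data = -0.5 * ((y - theta) / sigma) ** 2
    r = theta**2 - 1.0                     # illustrative residual r(theta)
    log_lik_virtual = -0.5 * (r / eps) ** 2
    return log_prior + log_lik_data + log_lik_virtual

# The virtual observable concentrates the posterior near the manifold
# r(theta) = 0 (here theta = ±1); the data y = 0.9 then select theta ≈ +1.
theta_grid = np.linspace(-2.0, 2.0, 2001)
post = np.exp([log_posterior(t, y=0.9) for t in theta_grid])
map_estimate = theta_grid[np.argmax(post)]
```

The same construction extends to vector-valued residuals of a discretized PDE, where each residual component contributes one virtual observation.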
Rather than separating model learning from the data-generation procedure (the latter relies on simulating atomistic motions governed by force fields), we query the atomistic force field at sample configurations proposed by the predictive coarse-grained model.
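The adaptive data-generation loop described above can be sketched as follows. The fine-grained "force field", the linear-in-parameters CG force model, and all step sizes are hypothetical stand-ins chosen so the example runs in a few lines; the point is the structure of the loop, in which the expensive oracle is queried only at configurations the CG model itself proposes.

```python
import numpy as np

rng = np.random.default_rng(0)

def fine_force(x):
    # expensive fine-grained oracle (here: a cheap analytic stand-in)
    return -4.0 * x**3 + 2.0 * x

# toy CG force model: f(x) ≈ c1*x + c3*x^3, fitted by least squares
data_x = list(rng.normal(0.0, 0.2, size=5))   # small initial data set
data_f = [fine_force(x) for x in data_x]

for _ in range(10):
    A = np.array([[x, x**3] for x in data_x])
    c1, c3 = np.linalg.lstsq(A, np.array(data_f), rcond=None)[0]
    # propose a new configuration via a short simulation of the CG model
    x = data_x[-1]
    for _ in range(20):
        x = x + 0.05 * (c1 * x + c3 * x**3) + 0.1 * rng.normal()
    data_x.append(x)                 # query the fine-grained force field
    data_f.append(fine_force(x))     # only at the proposed configuration

# since the toy data are noise-free and the model class contains the
# truth, the fit recovers the fine-grained coefficients (2, -4)
```

Because new training configurations come from the CG model's own dynamics, the data concentrate in the regions of configuration space the coarse model actually visits.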
The automated construction of coarse-grained models represents a pivotal component in computer simulation of physical systems and is a key enabler in various analysis and design tasks related to uncertainty quantification.
Surrogate modeling and uncertainty quantification tasks for PDE systems are most often considered as supervised learning problems where input and output data pairs are used for training.
In this work, we formulate the discovery of collective variables (CVs) as a Bayesian inference problem and consider the CVs as hidden generators of the full atomistic trajectory.
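The generative viewpoint above can be illustrated with a linear-Gaussian toy model: low-dimensional CVs z are hidden generators of the high-dimensional atomistic configuration x, e.g. x | z ~ N(W z, s² I), and inference over z is then a posterior computation. The decoder W, the noise level, and the dimensions are all illustrative; in the actual setting the decoder is learned and the posterior is generally intractable.

```python
import numpy as np

rng = np.random.default_rng(1)

d, k, s = 50, 2, 0.1           # atomistic dim, CV dim, observation noise
W = rng.normal(size=(d, k))    # hypothetical linear decoder from CVs to atoms

z_true = rng.normal(size=k)
x = W @ z_true + s * rng.normal(size=d)    # one "atomistic" snapshot

# posterior p(z | x) with prior z ~ N(0, I) is Gaussian:
#   precision P = I + W^T W / s^2,   mean = P^{-1} W^T x / s^2
P = np.eye(k) + W.T @ W / s**2
z_mean = np.linalg.solve(P, W.T @ x / s**2)
```

With many atomistic coordinates informing few CVs, the posterior is sharply peaked, so the posterior mean recovers the generating CVs closely.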
Direct numerical simulation of Stokes flow through an impermeable, rigid body matrix with finite elements requires meshes fine enough to resolve the pore scale and is thus a computationally expensive task.
This recasts the solution of both forward and inverse problems as probabilistic inference tasks in which the problem's state variables must be compatible not only with the data but also with the governing equations.
Both components are represented with latent variables in a probabilistic graphical model and are simultaneously trained using Stochastic Variational Inference methods.
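As a minimal sketch of the Stochastic Variational Inference machinery invoked above: a Gaussian approximation q(z) = N(mu, exp(2·log_s)) is fitted to an unnormalized log-density by ascending a Monte Carlo estimate of the ELBO gradient with the reparameterization trick. The target (a unit Gaussian centered at 3), the minibatch size, and the step size are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_joint(z):
    return -0.5 * (z - 3.0) ** 2      # target posterior is N(3, 1)

mu, log_s = 0.0, 0.0                  # variational parameters of q(z)
lr = 0.05
for _ in range(2000):
    eps = rng.normal(size=32)         # minibatch of base noise samples
    z = mu + np.exp(log_s) * eps      # reparameterized samples z ~ q
    dlogp = -(z - 3.0)                # d/dz log_joint(z)
    # ELBO gradients: chain rule through z, plus d(entropy)/d(log_s) = 1
    g_mu = dlogp.mean()
    g_logs = (dlogp * eps).mean() * np.exp(log_s) + 1.0
    mu += lr * g_mu
    log_s += lr * g_logs
```

At convergence mu ≈ 3 and log_s ≈ 0, i.e. q matches the target; in the graphical-model setting, the same stochastic-gradient updates are applied jointly to the parameters of all latent-variable components.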
We discuss a Bayesian formulation for coarse-graining (CG) of PDEs whose coefficients (e.g., material parameters) exhibit random, fine-scale variability.
The solution of such problems is hindered not only by the usual difficulties encountered in UQ tasks (e.g., the high computational cost of each forward simulation and the large number of random variables) but also by the need to solve a nonlinear optimization problem involving a large number of design variables and, potentially, constraints.