no code implementations • 30 May 2025 • Ricardo Baptista, Andrew M. Stuart, Son Tran
The methodology is typically framed as the identification of a set of encoders, one for each modality, that align representations within a common latent space.
no code implementations • 17 May 2025 • Ricardo Baptista, Panagiota Birmpa, Markos A. Katsoulakis, Luc Rey-Bellet, Benjamin J. Zhang
Building on the Benamou-Brenier dynamic formulation of optimal transport cost, we also establish a dynamic formulation for proximal OT divergences.
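For reference, the Benamou-Brenier dynamic formulation referred to above can be stated as follows (this is the standard form for the squared 2-Wasserstein distance; the notation is generic rather than the paper's):

```latex
% Benamou-Brenier dynamic formulation of the squared 2-Wasserstein distance
% between probability measures \mu and \nu on \mathbb{R}^d.
W_2^2(\mu, \nu)
  = \min_{(\rho_t, v_t)} \int_0^1 \int_{\mathbb{R}^d} \|v_t(x)\|^2 \, \rho_t(dx) \, dt
\quad \text{subject to} \quad
  \partial_t \rho_t + \nabla \cdot (\rho_t v_t) = 0, \qquad \rho_0 = \mu, \ \ \rho_1 = \nu .
```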
2 code implementations • 24 Apr 2025 • Eviatar Bach, Ricardo Baptista, Edoardo Calvello, Bohan Chen, Andrew Stuart
The ensemble approach uses empirical measures as input to the MNM and is implemented using the set transformer, which is invariant to ensemble permutation and allows for different ensemble sizes.
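As a rough illustration of the permutation-invariance property described above, here is a minimal DeepSets-style ensemble encoder in PyTorch. It uses mean pooling rather than the set transformer's attention-based pooling, and all names and sizes are illustrative rather than taken from the paper:

```python
import torch
import torch.nn as nn

class EnsembleEncoder(nn.Module):
    """Encode an ensemble {x_1, ..., x_N} into a single feature vector.

    Mean pooling over the ensemble axis makes the output invariant to
    permutations of the members and agnostic to the ensemble size N.
    """
    def __init__(self, state_dim: int, hidden_dim: int = 64, out_dim: int = 32):
        super().__init__()
        self.member_net = nn.Sequential(          # applied to each member
            nn.Linear(state_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        self.pooled_net = nn.Linear(hidden_dim, out_dim)  # applied after pooling

    def forward(self, ensemble: torch.Tensor) -> torch.Tensor:
        # ensemble: (batch, N, state_dim); N may vary between calls
        features = self.member_net(ensemble)       # (batch, N, hidden_dim)
        pooled = features.mean(dim=1)              # permutation-invariant
        return self.pooled_net(pooled)

# The encoding is unchanged under a permutation of the ensemble members.
enc = EnsembleEncoder(state_dim=3)
x = torch.randn(1, 10, 3)
perm = x[:, torch.randperm(10), :]
assert torch.allclose(enc(x), enc(perm), atol=1e-5)
```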
1 code implementation • 18 Mar 2025 • Sarah Liaw, Rebecca Morrison, Youssef Marzouk, Ricardo Baptista
Identifying the Markov properties or conditional independencies of a collection of random variables is a fundamental task in statistics for modeling and inference.
1 code implementation • 28 Jan 2025 • Ricardo Baptista, Edoardo Calvello, Matthieu Darcy, Houman Owhadi, Andrew M. Stuart, Xianjin Yang
We consider the use of Gaussian Processes (GPs) or Neural Networks (NNs) to numerically approximate the solutions to nonlinear partial differential equations (PDEs) with rough forcing or source terms, which commonly arise as pathwise solutions to stochastic PDEs.
1 code implementation • 27 Jan 2025 • Ricardo Baptista, Agnimitra Dasgupta, Nikola B. Kovachki, Assad Oberai, Andrew M. Stuart
Diffusion models have emerged as a powerful framework for generative modeling.
no code implementations • 13 Nov 2024 • Fengyi Li, Ricardo Baptista, Youssef Marzouk
We then address the estimation of EIG in high dimensions, by deriving gradient-based upper bounds on the mutual information lost by projecting the parameters and/or observations to lower-dimensional subspaces.
no code implementations • 11 Nov 2024 • Ricardo Baptista, Aram-Alexandre Pooladian, Michael Brennan, Youssef Marzouk, Jonathan Niles-Weed
Conditional simulation is a fundamental task in statistical modeling: Generate samples from the conditionals given finitely many data points from a joint distribution.
no code implementations • 25 Oct 2024 • Ricardo Baptista, Michael Brennan, Youssef Marzouk
Yet these matrices require gradients or even Hessians of the log-likelihood, excluding the purely data-driven setting and many problems of simulation-based inference.
no code implementations • 14 Oct 2024 • Eviatar Bach, Ricardo Baptista, Daniel Sanz-Alonso, Andrew Stuart
The aim of these notes is to demonstrate the potential for ideas from machine learning to impact the fields of inverse problems and data assimilation.
no code implementations • 30 Sep 2024 • Hongkai Zheng, Wenda Chu, Austin Wang, Nikola Kovachki, Ricardo Baptista, Yisong Yue
This reliance substantially restricts their use in the wide range of problems where such information is unavailable, as in many scientific applications.
1 code implementation • 13 Jul 2024 • Ricardo Baptista, Eliza O'Reilly, Yangxinyu Xie
We propose a computationally efficient algorithm for gradient-based linear dimension reduction and high-dimensional regression.
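For background only, one common gradient-based strategy for linear dimension reduction is to eigendecompose the averaged outer product of gradients (the active-subspace construction); the sketch below shows that idea on a toy function and should not be read as the algorithm proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 2000
a, b = rng.standard_normal(d), rng.standard_normal(d)

def f(x):
    # Toy function that only varies along two linear combinations of x.
    return np.sin(x @ a) + (x @ b) ** 2

def grad_f(x):
    return np.cos(x @ a) * a + 2.0 * (x @ b) * b

X = rng.standard_normal((n, d))
G = np.array([grad_f(x) for x in X])

# Eigendecompose the averaged outer product of gradients, H = E[grad f grad f^T].
H = G.T @ G / n
eigvals, eigvecs = np.linalg.eigh(H)
U = eigvecs[:, ::-1][:, :2]             # top-2 directions span the "active" subspace
X_reduced = X @ U                       # low-dimensional features for downstream regression
print(np.round(eigvals[::-1][:4], 3))   # eigenvalues drop sharply after the first two
```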
no code implementations • 26 Jun 2024 • Eviatar Bach, Ricardo Baptista, Enoch Luk, Andrew Stuart
Filtering - the task of estimating the conditional distribution for states of a dynamical system given partial and noisy observations - is important in many areas of science and engineering, including weather and climate prediction.
1 code implementation • 19 Jun 2024 • Qiao Chen, Elise Arnaud, Ricardo Baptista, Olivier Zahm
We introduce a new method to jointly reduce the dimension of the input and output space of a function between high-dimensional spaces.
no code implementations • 18 Jun 2024 • Berthy T. Feng, Ricardo Baptista, Katherine L. Bouman
We learn an approximate mirror map that transforms data into an unconstrained space and a corresponding approximate inverse that maps data back to the constraint set.
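As a concrete, non-learned instance of that structure, the entropic mirror map sends the positive orthant to all of R^d via the logarithm, with the exponential as its inverse; the snippet below only illustrates the forward/inverse pairing, not the approximate maps learned in the paper:

```python
import numpy as np

# Entropic mirror map: the gradient of phi(x) = sum_i (x_i log x_i - x_i)
# maps the positive orthant (the constraint set) to unconstrained R^d.
def mirror_map(x):
    return np.log(x)            # nabla phi

def inverse_mirror_map(y):
    return np.exp(y)            # (nabla phi)^{-1}

x = np.array([0.2, 1.5, 3.0])   # a point satisfying the constraint x > 0
y = mirror_map(x)               # unconstrained representation in R^3
x_back = inverse_mirror_map(y + 0.1 * np.random.randn(3))  # e.g. after a noisy step
assert np.all(x_back > 0)       # any point maps back into the constraint set
```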
1 code implementation • 28 Nov 2023 • Théo Bourdais, Pau Batlle, Xianjin Yang, Ricardo Baptista, Nicolas Rouquette, Houman Owhadi
Type 1: Approximate an unknown function given input/output data.
2 code implementations • 25 Oct 2023 • Zheyu Oliver Wang, Ricardo Baptista, Youssef Marzouk, Lars Ruthotto, Deepanshu Verma
Our static approach approximates the map as the gradient of a partially input-convex neural network.
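To illustrate the idea of a transport map defined as the gradient of a convex potential, here is a simplified sketch with a *fully* input-convex network (the paper uses a partially input-convex variant that is convex only in one block of inputs; the conditioning is omitted here for brevity, and all layer sizes are illustrative):

```python
import torch
import torch.nn as nn

class ICNN(nn.Module):
    """Simplified input-convex network: convex in its input x.

    Convexity holds because the weights acting on hidden activations are
    clamped to be non-negative and the activations are convex, non-decreasing.
    """
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.Wx = nn.ModuleList([nn.Linear(dim, hidden),
                                 nn.Linear(dim, hidden),
                                 nn.Linear(dim, 1)])
        self.Wz = nn.ParameterList([nn.Parameter(0.1 * torch.rand(hidden, hidden)),
                                    nn.Parameter(0.1 * torch.rand(1, hidden))])
        self.act = nn.Softplus()

    def forward(self, x):
        z = self.act(self.Wx[0](x))
        z = self.act(self.Wx[1](x) + z @ self.Wz[0].clamp(min=0).t())
        return self.Wx[2](x) + z @ self.Wz[1].clamp(min=0).t()

def transport_map(net: ICNN, x: torch.Tensor) -> torch.Tensor:
    """The map is the gradient of the convex potential, T(x) = grad phi(x)."""
    x = x.requires_grad_(True)
    phi = net(x).sum()
    return torch.autograd.grad(phi, x, create_graph=True)[0]

net = ICNN(dim=2)
samples = transport_map(net, torch.randn(5, 2))   # push reference samples through T
print(samples.shape)                              # torch.Size([5, 2])
```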
1 code implementation • 12 Oct 2023 • Mathieu Le Provost, Ricardo Baptista, Jeff D. Eldredge, Youssef Marzouk
In these settings, the Kalman filter and its ensemble version, the ensemble Kalman filter (EnKF), which are designed under Gaussian assumptions, suffer degraded performance.
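For context, the standard perturbed-observation EnKF analysis step underlying that discussion can be written in a few lines; this is the textbook linear/Gaussian update, not the nonlinear ensemble filter developed in the paper, and all variable names are illustrative:

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng):
    """Stochastic (perturbed-observation) EnKF analysis step.

    X : (d, N) forecast ensemble,  y : (m,) observation,
    H : (m, d) linear observation operator,  R : (m, m) observation noise covariance.
    The update relies on sample covariances, i.e. on Gaussian assumptions.
    """
    d, N = X.shape
    Xm = X - X.mean(axis=1, keepdims=True)           # ensemble anomalies
    Y = H @ X                                        # predicted observations
    Ym = Y - Y.mean(axis=1, keepdims=True)
    C_xy = Xm @ Ym.T / (N - 1)                       # state-observation cross-covariance
    C_yy = Ym @ Ym.T / (N - 1) + R                   # innovation covariance
    K = C_xy @ np.linalg.inv(C_yy)                   # Kalman gain
    perturbed = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (perturbed - Y)

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 50))                     # 50-member ensemble in R^3
H = np.array([[1.0, 0.0, 0.0]])                      # observe the first state only
Xa = enkf_analysis(X, y=np.array([0.5]), H=H, R=0.1 * np.eye(1), rng=rng)
```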
1 code implementation • 9 Jul 2023 • Jason Alfonso, Ricardo Baptista, Anupam Bhakta, Noam Gal, Alfin Hou, Isa Lyubimova, Daniel Pocklington, Josef Sajonz, Giulio Trigila, Ryan Tsai
Sampling conditional distributions is a fundamental task for Bayesian inference and density estimation.
1 code implementation • NeurIPS 2023 • Zhong Yi Wan, Ricardo Baptista, Yi-fan Chen, John Anderson, Anudhyan Boral, Fei Sha, Leonardo Zepeda-Núñez
Moreover, our procedure correctly matches the statistics of physical quantities, even when the low-frequency content of the inputs and outputs does not match, a crucial but difficult-to-satisfy assumption needed by current state-of-the-art alternatives.
no code implementations • 14 Feb 2023 • Jae Hyun Lim, Nikola B. Kovachki, Ricardo Baptista, Christopher Beckham, Kamyar Azizzadenesheli, Jean Kossaifi, Vikram Voleti, Jiaming Song, Karsten Kreis, Jan Kautz, Christopher Pal, Arash Vahdat, Anima Anandkumar
They consist of a forward process that perturbs input data with Gaussian white noise and a reverse process that learns a score function to generate samples by denoising.
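A minimal sketch of the two processes described above for finite-dimensional data: a forward perturbation with Gaussian noise and a denoising score matching loss for the reverse process. The network, noise schedule, and names here are illustrative, not the function-space construction studied in the paper:

```python
import torch
import torch.nn as nn

score_net = nn.Sequential(                 # s_theta(x, sigma): approximate score of noised data
    nn.Linear(3, 128), nn.SiLU(),
    nn.Linear(128, 128), nn.SiLU(),
    nn.Linear(128, 2),
)

def dsm_loss(x0: torch.Tensor, sigma: float) -> torch.Tensor:
    """Forward process: x_sigma = x0 + sigma * eps with Gaussian white noise eps.
    The reverse process is trained by denoising score matching: the network
    should predict the score of the perturbed data, which equals -eps / sigma."""
    eps = torch.randn_like(x0)
    x_noisy = x0 + sigma * eps
    inp = torch.cat([x_noisy, torch.full_like(x0[:, :1], sigma)], dim=1)
    target_score = -eps / sigma
    return ((score_net(inp) - target_score) ** 2).mean()

x0 = torch.randn(64, 2)                    # a batch of "clean" samples
loss = dsm_loss(x0, sigma=0.5)
loss.backward()                            # ready for an optimizer step
```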
1 code implementation • 31 Oct 2022 • Maximilian Ramgraber, Ricardo Baptista, Dennis McLaughlin, Youssef Marzouk
A companion paper (Ramgraber et al., 2023) explores the implementation of nonlinear ensemble transport smoothers in greater depth.
1 code implementation • 31 Oct 2022 • Maximilian Ramgraber, Ricardo Baptista, Dennis McLaughlin, Youssef Marzouk
Smoothing is a specialized form of Bayesian inference for state-space models that characterizes the posterior distribution of a collection of states given an associated sequence of observations.
no code implementations • 22 Jun 2022 • Ricardo Baptista, Lianghao Cao, Joshua Chen, Omar Ghattas, Fengyi Li, Youssef M. Marzouk, J. Tinsley Oden
We tackle this challenging Bayesian inference problem using a likelihood-free approach based on measure transport together with the construction of summary statistics for the image data.
2 code implementations • 10 Mar 2022 • Mathieu Le Provost, Ricardo Baptista, Youssef Marzouk, Jeff D. Eldredge
We propose a regularization method for ensemble Kalman filtering (EnKF) with elliptic observation operators.
no code implementations • 8 Jul 2021 • Rebecca E Morrison, Ricardo Baptista, Estelle L Basor
For a multivariate normal distribution, the sparsity of the covariance and precision matrices encodes complete information about independence and conditional independence properties.
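A small numerical illustration of that statement (the numbers are illustrative only): in the Gaussian below, variables 1 and 3 are conditionally independent given variable 2, which appears as a zero in the precision matrix even though the covariance is dense.

```python
import numpy as np

# A Gauss-Markov chain X1 -> X2 -> X3: X1 and X3 are conditionally
# independent given X2, so the (1,3) entry of the precision matrix is zero.
precision = np.array([[ 2.0, -1.0,  0.0],
                      [-1.0,  2.0, -1.0],
                      [ 0.0, -1.0,  2.0]])
covariance = np.linalg.inv(precision)

print(np.round(covariance, 3))                  # dense: X1 and X3 are marginally dependent
print(np.round(np.linalg.inv(covariance), 3))   # recovers the sparse precision matrix
```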
no code implementations • 8 Jan 2021 • Ricardo Baptista, Youssef Marzouk, Rebecca E. Morrison, Olivier Zahm
Undirected probabilistic graphical models represent the conditional dependencies, or Markov properties, of a collection of random variables.
1 code implementation • 22 Sep 2020 • Ricardo Baptista, Youssef Marzouk, Olivier Zahm
Transportation of measure provides a versatile approach for modeling complex probability distributions, with applications in density estimation, Bayesian inference, generative modeling, and beyond.
1 code implementation • 11 Jun 2020 • Ricardo Baptista, Bamdad Hosseini, Nikola B. Kovachki, Youssef Marzouk
We present a novel framework for conditional sampling of probability measures, using block triangular transport maps.
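To see the role of the block triangular structure, consider the Gaussian special case (purely illustrative, not the learned maps in the paper): the lower block-triangular Cholesky factor of the joint covariance pushes a standard normal onto (x, y), and fixing the first block while resampling the second produces draws from y | x.

```python
import numpy as np

rng = np.random.default_rng(0)

# Joint Gaussian on (x, y); L is its lower-triangular Cholesky factor,
# i.e. a block triangular transport map from N(0, I) to the joint.
mu = np.array([1.0, -2.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 2.0]])
L = np.linalg.cholesky(Sigma)

def sample_conditional(x_star, n):
    """Sample y | x = x_star by inverting the first block and resampling the second."""
    z1 = (x_star - mu[0]) / L[0, 0]          # first block: solve L11 z1 = x* - mu_x
    z2 = rng.standard_normal(n)              # fresh reference samples for the second block
    return mu[1] + L[1, 0] * z1 + L[1, 1] * z2

ys = sample_conditional(x_star=2.0, n=100_000)
# Compare with the closed-form Gaussian conditional mean and variance.
print(ys.mean(), mu[1] + Sigma[0, 1] / Sigma[0, 0] * (2.0 - mu[0]))   # both ~ -1.2
print(ys.var(),  Sigma[1, 1] - Sigma[0, 1] ** 2 / Sigma[0, 0])        # both ~ 1.36
```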
no code implementations • 30 Jun 2019 • Alessio Spantini, Ricardo Baptista, Youssef Marzouk
We consider filtering in high-dimensional non-Gaussian state-space models with intractable transition kernels, nonlinear and possibly chaotic dynamics, and sparse observations in space and time.
2 code implementations • ICML 2018 • Ricardo Baptista, Matthias Poloczek
The optimization of expensive-to-evaluate black-box functions over combinatorial structures is a ubiquitous task in machine learning, engineering and the natural sciences.
no code implementations • NeurIPS 2017 • Rebecca E. Morrison, Ricardo Baptista, Youssef Marzouk
We present an algorithm to identify sparse dependence structure in continuous and non-Gaussian probability distributions, given a corresponding set of data.