no code implementations • 5 Mar 2023 • Tiangang Cui, Sergey Dolgov, Olivier Zahm
We approximate the complicated target density by a composition of self-reinforced Knothe-Rosenblatt (KR) rearrangements, in which previously constructed KR rearrangements, based on the same approximation ansatz, are used to precondition the density approximation problem for building each new KR rearrangement.
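As a hedged illustration of the composition idea (notation ours, not taken verbatim from the paper): writing $\rho$ for a tractable reference density and $T_\sharp\rho$ for the pushforward of $\rho$ under a map $T$, the target $\pi$ is approximated by a composed map, and each new layer is fitted to the pullback of $\pi$ under the layers already built, which is close to $\rho$ and therefore easier to approximate:

\[
\pi \;\approx\; \big(T_1 \circ T_2 \circ \cdots \circ T_L\big)_{\sharp}\,\rho,
\qquad
T_\ell \ \text{a KR rearrangement fitted to}\ \ \big(T_1 \circ \cdots \circ T_{\ell-1}\big)^{\sharp}\pi,
\]

where $(T^{\sharp}\pi)(x) = \pi\big(T(x)\big)\,\big|\det \nabla T(x)\big|$ denotes the pullback density; fitting each layer to this pullback is what "preconditioning the density approximation problem" refers to above.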
no code implementations • 5 Sep 2022 • Tiangang Cui, Sergey Dolgov, Robert Scheichl
We approximate the optimal importance distribution in a general importance sampling problem as the pushforward of a reference distribution under a composition of order-preserving transformations, in which each transformation is formed by a squared tensor-train decomposition.
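A hedged sketch of the construction (the notation and the generic estimand are ours): for estimating $Z=\int h(x)\,\pi(x)\,\mathrm{d}x$, the variance-optimal importance density is proportional to $|h|\,\pi$, and it is approximated as the pushforward of a reference density $\rho$ under a composition of order-preserving maps, each induced by a nonnegative squared tensor-train ansatz:

\[
\pi^{*}(x) \;\propto\; |h(x)|\,\pi(x)
\;\approx\; \big(T_1 \circ \cdots \circ T_L\big)_{\sharp}\,\rho(x),
\qquad
T_\ell \ \text{induced by}\ \ p_\ell(x) \approx g_\ell(x)^2,\ \ g_\ell\ \text{in tensor-train format},
\]

so that $p_\ell \ge 0$ by construction and the associated Knothe-Rosenblatt map is monotone (order-preserving) in each coordinate.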
no code implementations • 5 Sep 2022 • Tiangang Cui, Zhongjian Wang, Zhiwen Zhang
We first formulate the solution of the non-Newtonian ice flow model as the minimizer of a variational integral with boundary constraints.
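A schematic version of such a variational formulation (constants, the pressure/incompressibility treatment, and the exact boundary terms are omitted, and this should not be read as the paper's precise functional): for Glen's flow law with exponent $n$, the velocity solves a power-law (p-Stokes type) minimization problem,

\[
u^{*} \;=\; \arg\min_{u \,:\, u|_{\Gamma} = g,\ \nabla\cdot u = 0}\;
\int_{\Omega} \Phi\big(|D(u)|\big)\,\mathrm{d}x \;-\; \int_{\Omega} f\cdot u\,\mathrm{d}x,
\qquad
\Phi(s) \ \propto\ s^{\,1+1/n},
\]

where $D(u)$ is the strain-rate tensor and the constraint $u|_{\Gamma}=g$ plays the role of the boundary constraints mentioned above.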
1 code implementation • 8 Jun 2021 • Tiangang Cui, Sergey Dolgov, Olivier Zahm
We present a novel offline-online method to mitigate the computational burden of the characterization of posterior random variables in statistical learning.
no code implementations • 26 Jan 2021 • Lingbin Bian, Tiangang Cui, B. T. Thomas Yeo, Alex Fornito, Adeel Razi, Jonathan Keith
Brain function relies on a precisely coordinated and dynamic balance between the functional integration and segregation of distinct neural systems.
no code implementations • 14 Jul 2020 • Tiangang Cui, Sergey Dolgov
The recent surge of interest in transport maps offers a mathematical foundation and new insights for tackling this challenge by coupling intractable random variables with tractable reference random variables.
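A minimal one-dimensional sketch of this coupling idea, with an illustrative Gamma target standing in for an intractable distribution (the specific map and target are ours, not the paper's): composing the reference CDF with the target's inverse CDF gives a monotone map that pushes reference samples onto the target.

```python
import numpy as np
from scipy.stats import norm, gamma

reference = norm()                  # tractable reference random variable
target = gamma(a=3.0, scale=2.0)    # stand-in for an intractable target

def transport_map(z):
    # 1-D (inverse-)Rosenblatt construction: u = F_ref(z), x = F_target^{-1}(u)
    return target.ppf(reference.cdf(z))

z = reference.rvs(size=10_000)
x = transport_map(z)                # x is distributed according to the target
print(x.mean(), target.mean())      # agree up to Monte Carlo error
```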
no code implementations • 15 Feb 2020 • Johnathan Bardsley, Tiangang Cui
In this work, we aim to develop scalable optimization-based Markov chain Monte Carlo (MCMC) methods for solving hierarchical Bayesian inverse problems with nonlinear parameter-to-observable maps and a broader class of hyperparameters.
1 code implementation • 23 Jan 2019 • Gianluca Detommaso, Hanne Hoitzing, Tiangang Cui, Ardavan Alamir
Bayesian online changepoint detection (BOCPD) (Adams & MacKay, 2007) offers a rigorous and viable way to identify changepoints in complex systems.
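For orientation, a minimal sketch of the standard BOCPD run-length recursion of Adams & MacKay (2007), using a constant hazard and a Normal-Normal predictive model as illustrative choices (this is the baseline algorithm, not the extension proposed in the paper):

```python
import numpy as np
from scipy.stats import norm

def bocpd(x, hazard=1 / 100, mu0=0.0, var0=1.0, sigma2=1.0):
    """Run-length posterior R[t, r] = P(run length r | x[:t]) for a Gaussian
    mean-shift model. The Normal-Normal predictive and constant hazard are
    modeling choices made for this sketch."""
    T = len(x)
    R = np.zeros((T + 1, T + 1))
    R[0, 0] = 1.0
    mu = np.array([mu0])            # posterior means, one per run length
    var = np.array([var0])          # posterior variances, one per run length
    for t, xt in enumerate(x):
        pred = norm.pdf(xt, loc=mu, scale=np.sqrt(var + sigma2))  # predictive per run length
        growth = R[t, :t + 1] * pred * (1 - hazard)               # run length grows by one
        cp = np.sum(R[t, :t + 1] * pred * hazard)                 # changepoint resets r to 0
        R[t + 1, 1:t + 2] = growth
        R[t + 1, 0] = cp
        R[t + 1] /= R[t + 1].sum()
        # conjugate Normal-Normal update of the sufficient statistics
        new_var = 1.0 / (1.0 / var + 1.0 / sigma2)
        new_mu = new_var * (mu / var + xt / sigma2)
        mu = np.concatenate(([mu0], new_mu))
        var = np.concatenate(([var0], new_var))
    return R
```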
1 code implementation • NeurIPS 2018 • Gianluca Detommaso, Tiangang Cui, Alessio Spantini, Youssef Marzouk, Robert Scheichl
Stein variational gradient descent (SVGD) was recently proposed as a general-purpose nonparametric variational inference algorithm [Liu & Wang, NIPS 2016]: it minimizes the Kullback-Leibler divergence between the target distribution and its approximation by implementing a form of functional gradient descent on a reproducing kernel Hilbert space.
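For reference, a minimal numpy sketch of one SVGD update with an RBF kernel (the baseline algorithm of Liu & Wang, not the Stein variational Newton method developed in the paper); `grad_log_p` and the bandwidth choice are illustrative.

```python
import numpy as np

def svgd_step(X, grad_log_p, h=1.0, eps=1e-2):
    """One SVGD update for particles X of shape (n, d).

    grad_log_p maps a (d,) point to the score of the target, d/dx log p(x);
    h is the RBF bandwidth and eps the step size. Illustrative sketch only.
    """
    n = X.shape[0]
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2.0 * h ** 2))                    # RBF kernel matrix
    G = np.stack([grad_log_p(x) for x in X])            # (n, d) target scores
    # kernelized Stein direction: smoothed scores (attraction) + kernel gradient (repulsion)
    phi = (K @ G + (K.sum(axis=1)[:, None] * X - K @ X) / h ** 2) / n
    return X + eps * phi

# toy usage: pull particles toward a standard normal target
X = np.random.randn(50, 2) + 3.0
for _ in range(200):
    X = svgd_step(X, grad_log_p=lambda x: -x, h=0.5, eps=0.05)
```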