Search Results for author: Robert Scheichl

Found 6 papers, 4 papers with code

Deep importance sampling using tensor trains with application to a priori and a posteriori rare event estimation

no code implementations · 5 Sep 2022 · Tiangang Cui, Sergey Dolgov, Robert Scheichl

We approximate the optimal importance distribution in a general importance sampling problem as the pushforward of a reference distribution under a composition of order-preserving transformations, in which each transformation is formed by a squared tensor-train decomposition.
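The pushforward idea above can be illustrated in one dimension. The sketch below is a hypothetical toy (a fixed shift map rather than the paper's squared tensor-train construction): a standard-normal reference is pushed through an order-preserving transformation toward a rare-event region, and the importance weight corrects for the change of density.

```python
import math
import random

random.seed(0)

# Hypothetical 1-D sketch (not the paper's tensor-train construction):
# estimate the rare-event probability P(X > 4) for X ~ N(0, 1) by pushing
# a standard-normal reference through an order-preserving map T(z) = z + 4.

def phi(x):                      # standard normal density
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

T = lambda z: z + 4.0            # order-preserving transformation, |T'| = 1

n = 200_000
est = 0.0
for _ in range(n):
    z = random.gauss(0.0, 1.0)   # sample from the reference
    y = T(z)                     # pushforward sample
    w = phi(y) / phi(y - 4.0)    # target density / pushforward density
    est += w * (y > 4.0)
est /= n

exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))  # P(X > 4), about 3.2e-5
```

Plain Monte Carlo would need millions of samples to see even one event here; the pushforward concentrates samples where the rare event occurs.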

Bayesian Inference

Multilevel Delayed Acceptance MCMC with an Adaptive Error Model in PyMC3

no code implementations · 10 Dec 2020 · Mikkel B. Lykkegaard, Grigorios Mingas, Robert Scheichl, Colin Fox, Tim J. Dodwell

Uncertainty Quantification through Markov Chain Monte Carlo (MCMC) can be prohibitively expensive for target probability densities with expensive likelihood functions, for instance when each evaluation involves solving a Partial Differential Equation (PDE), as is the case in a wide range of engineering applications.
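Delayed acceptance addresses this by screening proposals with a cheap surrogate before calling the expensive model. A minimal two-stage Metropolis sketch, with toy stand-in densities rather than the paper's PyMC3 implementation or adaptive error model:

```python
import math
import random

random.seed(1)

# Minimal two-stage delayed-acceptance Metropolis sketch. The coarse/fine
# log-densities below are toy stand-ins: the "fine" model plays the role of
# an expensive likelihood, the "coarse" one a cheap, slightly biased surrogate.

def log_fine(x):                 # "expensive" target: standard normal
    return -0.5 * x * x

def log_coarse(x):               # cheap surrogate with a small bias
    return -0.5 * (x - 0.1) ** 2

x, chain, fine_calls = 0.0, [], 0
for _ in range(20_000):
    y = x + random.gauss(0.0, 1.0)                  # random-walk proposal
    # Stage 1: screen the proposal with the cheap surrogate only.
    a1 = min(1.0, math.exp(log_coarse(y) - log_coarse(x)))
    if random.random() < a1:
        # Stage 2: correct with the expensive model; the ratio of ratios
        # makes the two-stage chain target the fine density exactly.
        fine_calls += 1
        a2 = min(1.0, math.exp((log_fine(y) - log_fine(x))
                               - (log_coarse(y) - log_coarse(x))))
        if random.random() < a2:
            x = y
    chain.append(x)
```

Proposals rejected at stage 1 never touch the fine model, so the expensive likelihood is evaluated far less often than once per step, while the stage-2 correction keeps the chain exact.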

Probabilistic Programming · Computation

Multilevel Monte Carlo for quantum mechanics on a lattice

1 code implementation · 7 Aug 2020 · Karl Jansen, Eike Hermann Müller, Robert Scheichl

This paper discusses hierarchical sampling methods to tame the growth in autocorrelations.
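A core ingredient of such hierarchical methods is the multilevel Monte Carlo telescoping sum, where coupled coarse/fine samples make the level corrections cheap. The sketch below uses a generic toy SDE (geometric Brownian motion via Euler steps), not the lattice quantum-mechanics application of the paper:

```python
import math
import random

random.seed(3)

# Generic multilevel Monte Carlo sketch on a toy SDE (geometric Brownian
# motion, S_0 = 1, E[S_T] = exp(r*T)); not the lattice application of the
# paper. Coarse and fine Euler paths share the same Brownian increments,
# so the level corrections have small variance and cheap coarse samples
# do most of the work.

r, sigma, T, L = 0.05, 0.2, 1.0, 4

def euler_pair(level):
    """One coupled (fine, coarse) Euler sample of S_T."""
    n_fine = 2 ** level
    h = T / n_fine
    dw = [random.gauss(0.0, math.sqrt(h)) for _ in range(n_fine)]
    s_fine = 1.0
    for w in dw:
        s_fine *= 1.0 + r * h + sigma * w
    if level == 0:
        return s_fine, 0.0
    s_coarse = 1.0
    for i in range(0, n_fine, 2):            # coarse path reuses increments
        s_coarse *= 1.0 + r * 2 * h + sigma * (dw[i] + dw[i + 1])
    return s_fine, s_coarse

estimate = 0.0
for level in range(L + 1):
    n_samples = max(500, 20_000 // 4 ** level)  # fewer samples on fine levels
    acc = 0.0
    for _ in range(n_samples):
        fine, coarse = euler_pair(level)
        acc += fine - coarse                    # telescoping correction
    estimate += acc / n_samples                 # sums to approx. E[S_T]
```

The telescoping sum puts most samples on the cheap coarse level, while the small-variance corrections restore the accuracy of the finest discretization.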

High Energy Physics - Lattice · Numerical Analysis · Computational Physics · 81-08, 81T25, 65Y20, 60J22 · F.2; J.2

HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference

1 code implementation · 25 May 2019 · Jakob Kruse, Gianluca Detommaso, Ullrich Köthe, Robert Scheichl

Many recent invertible neural architectures are based on coupling block designs in which the variables are divided into two subsets that serve as inputs of an easily invertible (usually affine) triangular transformation.
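The coupling design can be sketched in a few lines. Below, the learned scale and translation networks are replaced by fixed toy functions (an assumption for illustration; this shows the generic affine coupling block, not the HINT architecture itself):

```python
import math

# Toy affine coupling block. The "networks" s and t are hypothetical fixed
# functions standing in for learned maps; because they only ever see the
# first subset, the block is invertible regardless of what s and t are.

def s(u):                         # "scale" network acting on the first subset
    return [math.tanh(v) for v in u]

def t(u):                         # "translation" network on the first subset
    return [0.5 * v for v in u]

def forward(x):
    x1, x2 = x[:2], x[2:]         # divide the variables into two subsets
    y2 = [a * math.exp(si) + ti for a, si, ti in zip(x2, s(x1), t(x1))]
    return x1 + y2                # first subset passes through unchanged

def inverse(y):
    y1, y2 = y[:2], y[2:]
    x2 = [(b - ti) * math.exp(-si) for b, si, ti in zip(y2, s(y1), t(y1))]
    return y1 + x2

x = [0.3, -1.2, 0.7, 2.0]
y = forward(x)
x_back = inverse(y)               # recovers x up to float rounding
```

The inverse needs no iteration or matrix solve: since the first subset is copied through, s(x1) and t(x1) can be recomputed exactly, which is what makes the triangular transformation easy to invert.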

Bayesian Inference · Density Estimation

Approximation and sampling of multivariate probability distributions in the tensor train decomposition

1 code implementation · 2 Oct 2018 · Sergey Dolgov, Karim Anaya-Izquierdo, Colin Fox, Robert Scheichl

We find that the importance-weight corrected quasi-Monte Carlo quadrature performs best in all computed examples, and is orders-of-magnitude more efficient than DRAM across a wide range of approximation accuracies and sample sizes.

Numerical Analysis · Probability · Statistics Theory · 65D15, 65D32, 65C05, 65C40, 65C60, 62F15, 15A69, 15A23

A Stein variational Newton method

1 code implementation · NeurIPS 2018 · Gianluca Detommaso, Tiangang Cui, Alessio Spantini, Youssef Marzouk, Robert Scheichl

Stein variational gradient descent (SVGD) was recently proposed as a general-purpose nonparametric variational inference algorithm [Liu & Wang, NIPS 2016]: it minimizes the Kullback-Leibler divergence between the target distribution and its approximation by implementing a form of functional gradient descent on a reproducing kernel Hilbert space.
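The baseline SVGD update that the Newton method builds on can be sketched in one dimension. The following is a toy illustration of plain SVGD on a Gaussian target (an assumption for demonstration; it is not the second-order Newton variant of the paper):

```python
import math
import random

random.seed(2)

# Minimal 1-D SVGD sketch on a toy N(0, 1) target (baseline first-order
# update, not the Stein variational Newton method). Each particle moves
# along the kernelized functional gradient: an attraction toward high
# target density plus a kernel repulsion that keeps particles spread out.

def grad_log_p(x):                # target N(0, 1): d/dx log p = -x
    return -x

particles = [random.uniform(4.0, 6.0) for _ in range(50)]  # start far off
h, eps = 0.5, 0.2                 # kernel bandwidth and step size

for _ in range(500):
    new = []
    for xi in particles:
        phi = 0.0
        for xj in particles:
            k = math.exp(-(xj - xi) ** 2 / (2 * h))
            # attraction (driven by grad log p) + repulsion (gradient of k)
            phi += k * grad_log_p(xj) - (xj - xi) / h * k
        new.append(xi + eps * phi / len(particles))
    particles = new

mean = sum(particles) / len(particles)
var = sum((x - mean) ** 2 for x in particles) / len(particles)
```

The repulsive term is what distinguishes SVGD from running independent gradient ascents: without it all particles would collapse onto the mode instead of approximating the whole distribution.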

Variational Inference
