Search Results for author: Alessandro Barp

Found 13 papers, 2 papers with code

Deep Learning as Ricci Flow

1 code implementation • 22 Apr 2024 • Anthony Baptista, Alessandro Barp, Tapabrata Chakraborti, Chris Harbron, Ben D. MacArthur, Christopher R. S. Banerji

To illustrate this idea, we present a computational framework to quantify the geometric changes that occur as data passes through successive layers of a DNN, and use this framework to motivate a notion of 'global Ricci network flow' that can be used to assess a DNN's ability to disentangle complex data geometries to solve classification problems.

Improving embedding of graphs with missing data by soft manifolds

no code implementations • 29 Nov 2023 • Andrea Marinoni, Pietro Lio', Alessandro Barp, Christian Jutten, Mark Girolami

The reliability of graph embeddings directly depends on how much the geometry of the continuous space matches the graph structure.

Graph Embedding

Warped geometric information on the optimisation of Euclidean functions

no code implementations • 16 Aug 2023 • Marcelo Hartmann, Bernardo Williams, Hanlin Yu, Mark Girolami, Alessandro Barp, Arto Klami

We use notions from Riemannian geometry to recast the optimisation of a function on Euclidean space as optimisation on a Riemannian manifold with a warped metric, and then find the function's optimum along this manifold.

Controlling Moments with Kernel Stein Discrepancies

no code implementations • 10 Nov 2022 • Heishiro Kanagawa, Alessandro Barp, Arthur Gretton, Lester Mackey

Kernel Stein discrepancies (KSDs) measure the quality of a distributional approximation and can be computed even when the target density has an intractable normalizing constant.
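As a loose illustration of this property (not the paper's construction — the Gaussian target, IMQ base kernel, and parameters below are illustrative assumptions), a Langevin Stein kernel KSD can be estimated from samples using only the score function ∇ log p, which is unaffected by the normalizing constant:

```python
import numpy as np

def imq_stein_kernel_ksd(x, score, c=1.0, beta=-0.5):
    """V-statistic estimate of the kernel Stein discrepancy with an
    IMQ base kernel k(x, y) = (c^2 + ||x - y||^2)^beta.  Only the
    score grad log p enters, never the normalizing constant of p."""
    n, d = x.shape
    s = score(x)                                 # (n, d) scores at samples
    diff = x[:, None, :] - x[None, :, :]         # (n, n, d) pairwise x - y
    r2 = np.sum(diff**2, axis=-1)                # squared distances
    q = c**2 + r2
    k = q**beta
    gradx_k = 2 * beta * q[..., None]**(beta - 1) * diff   # grad_x k
    grady_k = -gradx_k                                      # grad_y k
    # trace of the mixed second derivative, tr(grad_x grad_y k)
    trace = -4*beta*(beta - 1)*q**(beta - 2)*r2 - 2*beta*d*q**(beta - 1)
    # Langevin Stein kernel k_p(x, y)
    kp = (trace
          + np.einsum('ijd,jd->ij', gradx_k, s)
          + np.einsum('ijd,id->ij', grady_k, s)
          + k * (s @ s.T))
    return kp.mean()

# Toy check: unnormalized standard normal target, score(x) = -x
rng = np.random.default_rng(0)
score = lambda x: -x
good = rng.standard_normal((200, 1))   # samples from the target
bad = good + 2.0                       # shifted, poor approximation
ksd_good = imq_stein_kernel_ksd(good, score)
ksd_bad = imq_stein_kernel_ksd(bad, score)
```

Since the Stein kernel is positive semi-definite, the V-statistic is nonnegative, and it should be markedly larger for the shifted sample.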

Targeted Separation and Convergence with Kernel Discrepancies

no code implementations • 26 Sep 2022 • Alessandro Barp, Carl-Johann Simon-Gabriel, Mark Girolami, Lester Mackey

Maximum mean discrepancies (MMDs) like the kernel Stein discrepancy (KSD) have grown central to a wide range of applications, including hypothesis testing, sampler selection, distribution approximation, and variational inference.
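For readers new to MMDs, a minimal sketch of the standard unbiased estimator of the squared MMD between two samples may help; the Gaussian RBF kernel and bandwidth here are illustrative choices, not ones prescribed by the paper:

```python
import numpy as np

def mmd2_unbiased(x, y, bandwidth=1.0):
    """Unbiased estimator of MMD^2 between samples x and y under a
    Gaussian RBF kernel (an illustrative kernel choice)."""
    def k(a, b):
        d2 = np.sum((a[:, None, :] - b[None, :, :])**2, axis=-1)
        return np.exp(-d2 / (2 * bandwidth**2))
    kxx, kyy, kxy = k(x, x), k(y, y), k(x, y)
    n, m = len(x), len(y)
    # drop diagonal terms of the within-sample sums for unbiasedness
    return ((kxx.sum() - np.trace(kxx)) / (n * (n - 1))
            + (kyy.sum() - np.trace(kyy)) / (m * (m - 1))
            - 2 * kxy.mean())

rng = np.random.default_rng(1)
x = rng.standard_normal((150, 2))
y_same = rng.standard_normal((150, 2))          # same distribution
y_far = rng.standard_normal((150, 2)) + 1.5     # shifted distribution
m_same = mmd2_unbiased(x, y_same)
m_far = mmd2_unbiased(x, y_far)
```

The estimate is near zero when both samples come from the same distribution and clearly positive under the shift.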

Variational Inference

Geometric Methods for Sampling, Optimisation, Inference and Adaptive Agents

no code implementations • 20 Mar 2022 • Alessandro Barp, Lancelot Da Costa, Guilherme França, Karl Friston, Mark Girolami, Michael I. Jordan, Grigorios A. Pavliotis

In this chapter, we identify fundamental geometric structures that underlie the problems of sampling, optimisation, inference and adaptive decision-making.

Counterfactual Decision Making

A Unifying and Canonical Description of Measure-Preserving Diffusions

no code implementations • 6 May 2021 • Alessandro Barp, So Takao, Michael Betancourt, Alexis Arnaudon, Mark Girolami

A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.

Metrizing Weak Convergence with Maximum Mean Discrepancies

no code implementations • 16 Jun 2020 • Carl-Johann Simon-Gabriel, Alessandro Barp, Bernhard Schölkopf, Lester Mackey

More precisely, we prove that, on a locally compact, non-compact, Hausdorff space, the MMD of a bounded continuous Borel measurable kernel k, whose reproducing kernel Hilbert space (RKHS) functions vanish at infinity, metrizes the weak convergence of probability measures if and only if k is continuous and integrally strictly positive definite (i.s.p.d.).

Minimum Stein Discrepancy Estimators

no code implementations • NeurIPS 2019 • Alessandro Barp, Francois-Xavier Briol, Andrew B. Duncan, Mark Girolami, Lester Mackey

We provide a unifying perspective of these techniques as minimum Stein discrepancy estimators, and use this lens to design new diffusion kernel Stein discrepancy (DKSD) and diffusion score matching (DSM) estimators with complementary strengths.

Statistical Inference for Generative Models with Maximum Mean Discrepancy

no code implementations • 13 Jun 2019 • Francois-Xavier Briol, Alessandro Barp, Andrew B. Duncan, Mark Girolami

While likelihood-based inference and its variants provide a statistically efficient and widely applicable approach to parametric inference, their application to models involving intractable likelihoods poses challenges.

Stein Point Markov Chain Monte Carlo

1 code implementation • 9 May 2019 • Wilson Ye Chen, Alessandro Barp, François-Xavier Briol, Jackson Gorham, Mark Girolami, Lester Mackey, Chris J. Oates

Stein Points are a class of algorithms for this task, which proceed by sequentially minimising a Stein discrepancy between the empirical measure and the target and, hence, require the solution of a non-convex optimisation problem to obtain each new point.
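The greedy construction can be sketched in miniature. The version below selects each new point from a fixed 1-D candidate grid rather than via the paper's MCMC exploration, and the IMQ Stein kernel, standard-normal target, and grid are all illustrative assumptions:

```python
import numpy as np

def stein_kernel(x, y, score, c=1.0):
    """Langevin Stein kernel in 1-D with IMQ base kernel, beta = -1/2."""
    r = x - y
    q = c**2 + r**2
    k = q**-0.5
    gx, gy = -q**-1.5 * r, q**-1.5 * r          # grad_x k, grad_y k
    trace = -3 * q**-2.5 * r**2 + q**-1.5       # tr(grad_x grad_y k)
    return trace + gx * score(y) + gy * score(x) + k * score(x) * score(y)

def greedy_stein_points(candidates, score, n_points=10):
    """Greedily pick points minimising the KSD of the augmented set:
    each step minimises k_p(x, x) + 2 * sum_s k_p(x, s) over candidates."""
    selected = []
    for _ in range(n_points):
        best, best_obj = None, np.inf
        for x in candidates:
            obj = stein_kernel(x, x, score) \
                  + 2 * sum(stein_kernel(x, s, score) for s in selected)
            if obj < best_obj:
                best, best_obj = x, obj
        selected.append(best)
    return np.array(selected)

score = lambda x: -x   # score of a standard normal target (illustrative)
pts = greedy_stein_points(np.linspace(-4, 4, 81), score, n_points=8)
```

The first point lands at the mode, and the diagonal term k_p(x, x) plus the pairwise repulsion spreads subsequent points over the high-probability region.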

Bayesian Inference

Geometry and Dynamics for Markov Chain Monte Carlo

no code implementations • 8 May 2017 • Alessandro Barp, Francois-Xavier Briol, Anthony D. Kennedy, Mark Girolami

The aim of this review is to provide a comprehensive introduction to the geometric tools used in Hamiltonian Monte Carlo at a level accessible to statisticians, machine learners and other users of the methodology with only a basic understanding of Monte Carlo methods.
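As background for readers unfamiliar with the methodology being reviewed, a bare-bones HMC sampler with a leapfrog integrator and identity mass matrix can be sketched as follows; the standard-normal target, step size, and trajectory length are illustrative assumptions, not settings from the review:

```python
import numpy as np

def leapfrog(x, p, grad_logp, step, n_steps):
    """Leapfrog integration of Hamiltonian dynamics (identity mass)."""
    p = p + 0.5 * step * grad_logp(x)        # initial half-step for momentum
    for _ in range(n_steps - 1):
        x = x + step * p
        p = p + step * grad_logp(x)
    x = x + step * p
    p = p + 0.5 * step * grad_logp(x)        # final half-step for momentum
    return x, p

def hmc(logp, grad_logp, x0, n_samples=1000, step=0.2, n_steps=10, seed=0):
    rng = np.random.default_rng(seed)
    x, samples = np.atleast_1d(np.asarray(x0, dtype=float)), []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)     # resample momentum
        h0 = -logp(x) + 0.5 * p @ p          # Hamiltonian before the flight
        x_new, p_new = leapfrog(x, p, grad_logp, step, n_steps)
        h1 = -logp(x_new) + 0.5 * p_new @ p_new
        if rng.random() < np.exp(min(0.0, h0 - h1)):  # Metropolis correction
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Standard normal target (illustrative): logp up to an additive constant
samples = hmc(lambda x: -0.5 * x @ x, lambda x: -x, x0=[3.0])
```

The Metropolis correction compensates for the leapfrog discretisation error, so the chain leaves the target invariant despite the approximate dynamics.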
