1 code implementation • 26 Jun 2024 • Mathieu Fourment, Matthew Macaulay, Christiaan J. Swanepoel, Xiang Ji, Marc A. Suchard, Frederick A. Matsen IV
Furthermore, we explore the use of the forward KL divergence as an optimization criterion for variational inference, since it can handle discontinuous and non-differentiable models.
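As an illustration of why the forward KL is attractive here, the sketch below estimates KL(p‖q) for a toy one-dimensional target using self-normalized importance sampling, which only requires evaluating log-densities and never differentiates through the model. This is a minimal sketch with made-up names, not the paper's implementation.

```python
import numpy as np

# Minimal sketch: self-normalized importance sampling estimate of the forward
# KL divergence KL(p || q) between an unnormalized target p and a Gaussian
# variational distribution q. Illustrative toy example only.

rng = np.random.default_rng(0)

def log_p_unnorm(z):
    # Unnormalized target: a standard normal shifted to mean 2.0.
    return -0.5 * (z - 2.0) ** 2

def log_q(z, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((z - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

def forward_kl_estimate(mu, log_sigma, n=5000):
    # Draw from q, then reweight towards p; no gradient of the model is needed,
    # which is why the forward KL can cope with non-differentiable targets.
    z = mu + np.exp(log_sigma) * rng.standard_normal(n)
    log_w = log_p_unnorm(z) - log_q(z, mu, log_sigma)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                      # self-normalized importance weights
    # E_p[log p - log q], up to the unknown log normalizer of p.
    return np.sum(w * (log_p_unnorm(z) - log_q(z, mu, log_sigma)))

print(forward_kl_estimate(mu=0.0, log_sigma=0.0))  # poor fit: larger value
print(forward_kl_estimate(mu=2.0, log_sigma=0.0))  # good fit: ~log normalizer (0.92)
```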
1 code implementation • 21 Sep 2023 • Matthew Macaulay, Mathieu Fourment
We present soft-NJ, a differentiable version of neighbour-joining that enables gradient-based optimisation over the space of trees.
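One way such a relaxation can work is to replace the hard argmin in the neighbour-joining pair-selection step with a temperature-controlled softmax, making the selection differentiable in the pairwise distances. The sketch below shows that idea on a toy distance matrix; it is an assumed, illustrative relaxation and not necessarily the exact construction used by soft-NJ.

```python
import numpy as np

# Minimal sketch: soften the pair-selection step of neighbour-joining with a
# softmax over the negative Q criterion. Illustrative only.

def q_matrix(d):
    # Standard neighbour-joining criterion: Q_ij = (n - 2) d_ij - r_i - r_j,
    # where r_i is the i-th row sum of the distance matrix.
    n = d.shape[0]
    r = d.sum(axis=1)
    q = (n - 2) * d - r[:, None] - r[None, :]
    np.fill_diagonal(q, np.inf)       # never join a taxon with itself
    return q

def soft_pair_selection(d, temperature=0.1):
    # Softmax over -Q: as the temperature goes to zero, the weights approach
    # the hard argmin used by classic neighbour-joining.
    q = q_matrix(d)
    finite = np.isfinite(q)
    logits = -q[finite] / temperature
    w = np.exp(logits - logits.max())
    w /= w.sum()
    weights = np.zeros_like(q)
    weights[finite] = w
    return weights                    # soft "which pair to join" weights

# Toy distance matrix on four taxa.
d = np.array([[0., 5., 9., 9.],
              [5., 0., 10., 10.],
              [9., 10., 0., 8.],
              [9., 10., 8., 0.]])
print(soft_pair_selection(d, temperature=0.5).round(3))
```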
1 code implementation • 9 Nov 2022 • Christiaan Swanepoel, Mathieu Fourment, Xiang Ji, Hassan Nasif, Marc A. Suchard, Frederick A. Matsen IV, Alexei Drummond
Probabilistic programming frameworks are powerful tools for statistical modelling and inference.
2 code implementations • 3 Nov 2022 • Mathieu Fourment, Christiaan J. Swanepoel, Jared G. Galloway, Xiang Ji, Karthik Gangavarapu, Marc A. Suchard, Frederick A. Matsen IV
Gradients of probabilistic model likelihoods with respect to their parameters are essential for modern computational statistics and machine learning.
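As a small, hedged illustration of the kind of gradient in question (using a toy exponential model rather than the phylogenetic models treated in the paper), the sketch below writes the analytic gradient of a log-likelihood and checks it against a central finite difference, the sort of correctness test gradient implementations are commonly validated with.

```python
import numpy as np

# Minimal sketch: analytic gradient of a log-likelihood checked against a
# central finite difference. Toy exponential model; illustrative only.

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=100)   # observations with true rate 0.5

def log_lik(rate):
    return x.size * np.log(rate) - rate * x.sum()

def grad_log_lik(rate):
    # d/d(rate) [ n log(rate) - rate * sum(x) ] = n / rate - sum(x)
    return x.size / rate - x.sum()

rate, eps = 0.7, 1e-6
fd = (log_lik(rate + eps) - log_lik(rate - eps)) / (2 * eps)
print(grad_log_lik(rate), fd)              # should agree to ~6 decimal places
```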
1 code implementation • 16 Jun 2022 • Matthew Macaulay, Aaron E. Darling, Mathieu Fourment
In this paper, we embed genomic sequences into hyperbolic space and perform hyperbolic Markov chain Monte Carlo for Bayesian inference.
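For a feel of the geometry involved, the sketch below computes distances in the Poincaré disk model of hyperbolic space and runs a simple Metropolis random walk restricted to the disk, targeting a toy density defined through that distance. This is a minimal, assumed illustration; the paper's embedding and MCMC kernel are more elaborate.

```python
import numpy as np

# Minimal sketch: Poincare-disk distance and a random-walk Metropolis sampler
# whose proposals are rejected if they leave the open unit disk. Toy example.

rng = np.random.default_rng(2)
ANCHOR = np.array([0.3, 0.0])              # fixed reference point in the disk

def poincare_distance(u, v):
    # d(u, v) = arccosh(1 + 2 ||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))
    num = 2.0 * np.sum((u - v) ** 2)
    den = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + num / den)

def log_target(z):
    # Toy log-density: penalize hyperbolic distance to the anchor point.
    return -poincare_distance(z, ANCHOR) ** 2

def metropolis_step(z, step=0.05):
    prop = z + step * rng.standard_normal(2)
    if np.sum(prop ** 2) >= 1.0:           # outside the unit disk: density is 0
        return z
    if np.log(rng.uniform()) < log_target(prop) - log_target(z):
        return prop
    return z

z = np.zeros(2)
for _ in range(1000):
    z = metropolis_step(z)
print(z, poincare_distance(z, ANCHOR))
```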
1 code implementation • 28 Nov 2018 • Mathieu Fourment, Andrew F. Magee, Chris Whidden, Arman Bilge, Frederick A. Matsen IV, Vladimir N. Minin
The marginal likelihood is a key quantity for assessing the evidence that the data provide in support of a model.
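To make the quantity concrete, the sketch below computes the simplest Monte Carlo estimate of a marginal likelihood (averaging the likelihood over prior draws) in a conjugate normal model where the exact answer is available in closed form. This is an illustrative toy baseline, not one of the estimators evaluated in the paper.

```python
import numpy as np
from scipy import stats

# Minimal sketch: marginal likelihood p(y) = \int p(y | mu) p(mu) dmu for a
# normal model with known variance and a normal prior on the mean, estimated
# by averaging the likelihood over prior samples and compared with the exact
# closed-form value. Toy example only.

rng = np.random.default_rng(3)

sigma, tau = 1.0, 2.0                 # known likelihood sd and prior sd
y = rng.normal(loc=1.5, scale=sigma, size=20)

def log_lik(mu):
    return stats.norm(mu, sigma).logpdf(y).sum()

# Naive Monte Carlo estimate: average the likelihood under prior draws of mu,
# computed stably on the log scale with a log-sum-exp shift.
mus = rng.normal(0.0, tau, size=20000)
log_liks = np.array([log_lik(m) for m in mus])
log_ml_mc = np.log(np.mean(np.exp(log_liks - log_liks.max()))) + log_liks.max()

# Closed form: under the prior, y is multivariate normal with mean zero and
# covariance sigma^2 * I + tau^2 * (ones matrix).
n = y.size
cov = sigma ** 2 * np.eye(n) + tau ** 2 * np.ones((n, n))
log_ml_exact = stats.multivariate_normal(mean=np.zeros(n), cov=cov).logpdf(y)

print(log_ml_mc, log_ml_exact)        # the two values should roughly agree
```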