no code implementations • 9 Feb 2023 • Kes Ward, Gaetano Romano, Idris Eckley, Paul Fearnhead
This is possible by using pruning ideas, which reduce the set of changepoint locations that need to be considered at time $T$ to approximately $\log T$.
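As a hedged toy illustration of why so few candidates survive (not the authors' implementation): for a Gaussian change-in-mean CUSUM, the candidates kept by functional pruning can be identified with vertices of the greatest convex minorant of the partial-sum random walk, and the expected number of minorant segments for a walk of length $T$ is the harmonic number $H_T \approx \log T$.

```python
import numpy as np

def convex_minorant_vertices(s):
    """Indices of the vertices of the greatest convex minorant of the
    path (t, s[t]). The expected number of minorant segments of a
    length-T random walk is the harmonic number H_T, roughly log T."""
    hull = []
    for t in range(len(s)):
        while len(hull) >= 2:
            i, j = hull[-2], hull[-1]
            # pop j if it lies on or above the chord from i to t
            if (j - i) * (s[t] - s[i]) - (s[j] - s[i]) * (t - i) <= 0:
                hull.pop()
            else:
                break
        hull.append(t)
    return hull

rng = np.random.default_rng(0)
T = 100_000
s = np.concatenate([[0.0], np.cumsum(rng.normal(size=T))])
kept = convex_minorant_vertices(s)
print(len(kept), np.log(T))  # both are of order 10, not 100000
```

The stack-based hull update is amortised constant time per observation, which is the flavour of cost the paper achieves.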
no code implementations • 6 Feb 2023 • Gaetano Romano, Idris A. Eckley, Paul Fearnhead

Thanks to functional pruning ideas, NP-FOCuS has a computational cost that is log-linear in the number of observations and is suitable for high-frequency data streams.
1 code implementation • 7 Nov 2022 • Jie Li, Paul Fearnhead, Piotr Fryzlewicz, Tengyao Wang
We show how to automatically generate new offline detection methods based on training a neural network.
1 code implementation • 28 Oct 2022 • Srshti Putcha, Christopher Nemeth, Paul Fearnhead
Stochastic gradient MCMC (SGMCMC) offers a scalable alternative to traditional MCMC, by constructing an unbiased estimate of the gradient of the log-posterior with a small, uniformly weighted subsample of the data.
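A minimal sketch of the unbiased gradient estimate described here, on a toy Gaussian model of my own choosing (all names and parameters are illustrative, not from the paper): the minibatch sum of likelihood gradients is rescaled by $N/n$ so its expectation matches the full-data gradient.

```python
import numpy as np

def stoch_grad_log_post(theta, data, batch_idx, grad_log_prior, grad_log_lik):
    """Unbiased estimate of grad log posterior from a uniform subsample:
    the minibatch likelihood-gradient sum is rescaled by N / n."""
    N, n = len(data), len(batch_idx)
    return grad_log_prior(theta) + (N / n) * sum(
        grad_log_lik(theta, data[i]) for i in batch_idx
    )

# Toy model (an assumption for illustration): x_i ~ N(theta, 1), theta ~ N(0, 1).
grad_log_prior = lambda th: -th
grad_log_lik = lambda th, x: x - th

rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, size=1000)
theta = 0.5
full = grad_log_prior(theta) + sum(grad_log_lik(theta, x) for x in data)

# Averaging many subsampled estimates recovers the full-data gradient.
ests = [stoch_grad_log_post(theta, data, rng.integers(0, 1000, size=10),
                            grad_log_prior, grad_log_lik) for _ in range(5000)]
print(full, np.mean(ests))
```

Each individual estimate is noisy, which is exactly the issue control variates (as in this line of work) aim to reduce.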
no code implementations • 19 May 2022 • Matthew Sutton, Robert Salomone, Augustin Chevallier, Paul Fearnhead
We show how PDMPs, and particularly the Zig-Zag sampler, can be implemented to sample from such an extended distribution.
no code implementations • 18 Feb 2022 • Augustin Chevallier, Frédéric Cazals, Paul Fearnhead
Computing the volume of a polytope in high dimensions is computationally challenging but has wide applications.
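As a hedged sketch of why this is hard: naive hit-or-miss Monte Carlo works for a simple low-dimensional polytope, but its hit rate collapses in high dimensions, which is what motivates the smarter samplers studied in this line of work. The example below (my own toy, not the paper's method) estimates the volume of the standard simplex in $d=3$, whose true volume is $1/d! = 1/6$.

```python
import numpy as np

# Hit-or-miss Monte Carlo estimate of the volume of the standard simplex
# {x in [0,1]^d : sum x_i <= 1}, whose true volume is 1/d!.
# The hit rate decays like 1/d!, so naive rejection fails in high
# dimensions and more sophisticated samplers are needed.
rng = np.random.default_rng(2)
d, n = 3, 200_000
pts = rng.random((n, d))
hit_rate = np.mean(pts.sum(axis=1) <= 1.0)
est = hit_rate * 1.0  # the enclosing unit cube has volume 1
print(est, 1 / 6)
```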
1 code implementation • NeurIPS 2023 • Gaetano Romano, Idris Eckley, Paul Fearnhead, Guillem Rigaill
Online algorithms for detecting a change in mean often involve using a moving window, or specifying the expected size of change.
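A minimal moving-window detector of the kind this abstract alludes to, with a hand-picked window and threshold (both illustrative choices of mine; avoiding exactly these tuning choices is the point of the paper):

```python
import numpy as np

def window_detector(x, w=50, threshold=1.0):
    """Flag the first time the mean of the last w points differs from the
    mean of the preceding w points by more than `threshold`.
    Returns None if no change is flagged."""
    for t in range(2 * w, len(x) + 1):
        left = x[t - 2 * w : t - w].mean()
        right = x[t - w : t].mean()
        if abs(right - left) > threshold:
            return t
    return None

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(2, 1, 500)])
t_hat = window_detector(x)
print(t_hat)  # flags shortly after the change at t = 500
```

A fixed window of 50 is well matched to a size-2 change but would badly delay detection of a small change, illustrating the sensitivity to window choice.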
1 code implementation • 22 Oct 2020 • Augustin Chevallier, Paul Fearnhead, Matthew Sutton
A new class of Markov chain Monte Carlo (MCMC) algorithms, based on simulating piecewise deterministic Markov processes (PDMPs), have recently shown great promise: they are non-reversible, can mix better than standard MCMC algorithms, and can use subsampling ideas to speed up computation in big data scenarios.
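As a hedged sketch of the simplest such PDMP, here is a one-dimensional Zig-Zag process targeting $N(0,1)$ (a standard textbook case, not code from the paper): the state moves at velocity $v \in \{-1,+1\}$ and flips $v$ at events of rate $\lambda(x,v) = \max(0, v\,U'(x))$ with $U(x) = x^2/2$, for which event times can be sampled exactly by inverting the integrated rate.

```python
import numpy as np

def zigzag_gaussian(T=50_000.0, seed=4):
    """Zig-Zag process targeting N(0,1): U(x) = x^2/2, so the switching
    rate is lambda(x, v) = max(0, v * x), with v in {-1, +1}.
    Event times are sampled exactly by inverting the integrated rate."""
    rng = np.random.default_rng(seed)
    x, v, t = 0.0, 1.0, 0.0
    times, xs = [0.0], [0.0]
    while t < T:
        e = rng.exponential()
        a = v * x  # rate at lag tau is max(0, a + tau), since v^2 = 1
        if a >= 0:
            tau = -a + np.sqrt(a * a + 2 * e)
        else:
            tau = -a + np.sqrt(2 * e)
        x += v * tau
        t += tau
        v = -v  # velocity flips at each event
        times.append(t)
        xs.append(x)
    return np.array(times), np.array(xs)

times, xs = zigzag_gaussian()
# The trajectory is piecewise linear, so linear interpolation on a fine
# grid recovers the continuous-time ergodic averages exactly.
grid = np.linspace(0, times[-1], 200_000)
pos = np.interp(grid, times, xs)
print(pos.mean(), pos.var())  # close to 0 and 1
```

Note the non-reversibility: the process keeps moving in one direction until an event fires, rather than proposing and rejecting as in standard MCMC.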
no code implementations • 4 Sep 2019 • Alexander T. M. Fisch, Idris A. Eckley, Paul Fearnhead
In recent years, there has been a growing interest in identifying anomalous structure within multivariate data streams.
1 code implementation • 16 Jul 2019 • Christopher Nemeth, Paul Fearnhead
In this paper, we focus on a particular class of scalable Monte Carlo algorithms, stochastic gradient Markov chain Monte Carlo (SGMCMC), which utilises data subsampling techniques to reduce the per-iteration cost of MCMC.
2 code implementations • 29 Jan 2019 • Christopher Aicher, Srshti Putcha, Christopher Nemeth, Paul Fearnhead, Emily B. Fox
We evaluate our proposed particle-buffered stochastic gradient within stochastic gradient MCMC, for inference on both long synthetic sequences and minute-resolution financial returns data, demonstrating the importance of this class of methods.
4 code implementations • 29 Sep 2018 • Toby Dylan Hocking, Guillem Rigaill, Paul Fearnhead, Guillaume Bourque
We describe a new algorithm and R package for peak detection in genomic data sets using constrained changepoint algorithms.
Subjects: Computation
1 code implementation • NeurIPS 2018 • Jack Baker, Paul Fearnhead, Emily B. Fox, Christopher Nemeth
Unfortunately, many popular large-scale Bayesian models, such as network or topic models, require inference on sparse simplex spaces.
1 code implementation • 5 Jun 2018 • Alexander T. M. Fisch, Idris A. Eckley, Paul Fearnhead
Theoretical results establish the consistency of CAPA at detecting collective anomalies and, as a by-product, the consistency of a popular penalised-cost-based method for detecting changes in mean and variance.
1 code implementation • 21 Feb 2018 • Sean Jewell, Toby Dylan Hocking, Paul Fearnhead, Daniela Witten
Calcium imaging data promises to transform the field of neuroscience by making it possible to record from large populations of neurons simultaneously.
Subjects: Methodology; Neurons and Cognition; Applications
1 code implementation • 2 Oct 2017 • Jack Baker, Paul Fearnhead, Emily B. Fox, Christopher Nemeth
To do this, the package uses the software library TensorFlow, which has a variety of statistical distributions and mathematical operations as standard, meaning a wide class of models can be built using this framework.
1 code implementation • 16 Jun 2017 • Jack Baker, Paul Fearnhead, Emily B. Fox, Christopher Nemeth
These methods use a noisy estimate of the gradient of the log posterior, which reduces the per-iteration computational cost of the algorithm.
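A hedged sketch of the canonical algorithm in this class, stochastic gradient Langevin dynamics (SGLD), on a conjugate toy model of my own (all parameters illustrative): each iteration takes half a gradient step using the noisy subsampled gradient, plus injected Gaussian noise of matching scale.

```python
import numpy as np

# SGLD sketch for a toy model: x_i ~ N(theta, 1) with a flat prior,
# so the posterior is N(mean(x), 1/N). Step size and batch size are
# illustrative choices, not tuned values from the paper.
rng = np.random.default_rng(5)
N, n, eps = 2000, 50, 1e-4
data = rng.normal(1.0, 1.0, size=N)

theta, samples = 0.0, []
for it in range(20_000):
    batch = data[rng.integers(0, N, size=n)]
    grad = (N / n) * np.sum(batch - theta)  # noisy grad of log-posterior
    theta += 0.5 * eps * grad + np.sqrt(eps) * rng.normal()
    if it >= 5_000:  # discard burn-in
        samples.append(theta)

print(np.mean(samples), data.mean())  # posterior mean is approx mean(x)
```

With a finite step size the injected gradient noise inflates the posterior variance somewhat; the mean is still recovered, and decreasing `eps` trades accuracy against mixing speed.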
7 code implementations • 9 Mar 2017 • Toby Dylan Hocking, Guillem Rigaill, Paul Fearnhead, Guillaume Bourque
This leads to a new algorithm which can solve problems with arbitrary affine constraints on adjacent segment means, and which has empirical time complexity that is log-linear in the amount of data.
4 code implementations • 16 Jan 2017 • Joris Bierkens, Alexandre Bouchard-Côté, Arnaud Doucet, Andrew B. Duncan, Paul Fearnhead, Thibaut Lienart, Gareth Roberts, Sebastian J. Vollmer
Piecewise Deterministic Monte Carlo algorithms enable simulation from a posterior distribution, whilst only needing to access a sub-sample of data at each iteration.
Subjects: Methodology; Computation
no code implementations • 6 Jan 2017 • Robert Maidstone, Paul Fearnhead, Adam Letchford
We define "best" via a criterion that measures fit to the data using the residual sum of squares, while penalising complexity through an $L_0$ penalty on changes in slope.
no code implementations • 23 Nov 2016 • Paul Fearnhead, Joris Bierkens, Murray Pollock, Gareth O. Roberts
Recently there have been exciting developments in Monte Carlo methods, with the development of new MCMC and sequential Monte Carlo (SMC) algorithms which are based on continuous-time, rather than discrete-time, Markov processes.
1 code implementation • 23 Sep 2016 • Paul Fearnhead, Guillem Rigaill
We present an approach to changepoint detection that is robust to the presence of outliers.
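To illustrate the robustness idea in one dimension (a toy of my own; the paper embeds a biweight-style loss inside a penalised cost over changepoints, which this sketch does not do): replacing squared loss with a bounded loss caps the influence any single outlier can exert on the fitted level.

```python
import numpy as np

def penalised_fit(y, loss):
    """Location estimate minimising sum of loss(y - mu) over a grid (toy)."""
    grid = np.linspace(y.min(), y.max(), 2001)
    costs = [np.sum(loss(y - mu)) for mu in grid]
    return grid[int(np.argmin(costs))]

square = lambda r: r ** 2
# Bounded (biweight-style) loss: each point contributes at most K^2.
K = 3.0
bounded = lambda r: np.minimum(r ** 2, K ** 2)

rng = np.random.default_rng(6)
y = np.concatenate([rng.normal(0, 1, 95), np.full(5, 50.0)])  # 5 gross outliers
m_sq = penalised_fit(y, square)
m_rb = penalised_fit(y, bounded)
print(m_sq, m_rb)  # squared loss is dragged towards the outliers
```

Under squared loss the estimate is pulled towards 2.5 by the five outliers at 50, while the bounded loss leaves it near the bulk of the data at 0.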
no code implementations • 20 Jul 2016 • James Edwards, Paul Fearnhead, Kevin Glazebrook
We study its use in a class of exponential family MABs and identify weaknesses, including a propensity to take actions which are dominated with respect to both exploitation and exploration.
6 code implementations • 11 Jul 2016 • Joris Bierkens, Paul Fearnhead, Gareth Roberts
Standard MCMC methods can scale poorly to big data settings due to the need to evaluate the likelihood at each iteration.
Subjects: Computation; Probability. MSC classes: 65C60, 65C05, 62F15, 60J25
no code implementations • 23 Dec 2014 • Christopher Nemeth, Chris Sherlock, Paul Fearnhead
This paper proposes a new sampling scheme based on Langevin dynamics that is applicable within pseudo-marginal and particle Markov chain Monte Carlo algorithms.
1 code implementation • 11 Dec 2014 • Kaylea Haynes, Idris A. Eckley, Paul Fearnhead
The computational complexity of this approach can be linear in the number of data points and linear in the difference between the number of changepoints in the optimal segmentations for the smallest and largest penalty values.
no code implementations • 29 Aug 2014 • Paul Fearnhead, Loukia Meligkotsidou
We then use the MCMC moves to update the latent variables, and the particle filter to propose new values for the parameters and stochastic process given the latent variables.
no code implementations • 4 Feb 2014 • Chris Nemeth, Paul Fearnhead
Currently the default is to use random walk Metropolis to update the parameter values.
no code implementations • 4 Jun 2013 • Christopher Nemeth, Paul Fearnhead, Lyudmila Mihaylova
This paper introduces an alternative approach for estimating these terms at a computational cost that is linear in the number of particles.
1 code implementation • 16 Oct 2009 • Paul Fearnhead, Zhen Liu
We consider Bayesian analysis of a class of multiple changepoint models.