no code implementations • NeurIPS 2007 • Alexandre Bouchard-Côté, Percy S. Liang, Dan Klein, Thomas L. Griffiths
We present a probabilistic approach to language change in which word forms are represented by phoneme sequences that undergo stochastic edits along the branches of a phylogenetic tree.
no code implementations • NeurIPS 2008 • Alexandre Bouchard-Côté, Dan Klein, Michael I. Jordan
Accurate and efficient inference in evolutionary trees is a central problem in computational biology.
no code implementations • NeurIPS 2009 • Alexandre Bouchard-Côté, Slav Petrov, Dan Klein
Pruning can massively accelerate the computation of feature expectations in large models.
no code implementations • NeurIPS 2010 • Alexandre Bouchard-Côté, Michael I. Jordan
Since the discovery of sophisticated fully polynomial randomized algorithms for a range of #P problems (Karzanov et al., 1991; Jerrum et al., 2001; Wilson, 2004), theoretical work on approximate inference in combinatorial spaces has focused on Markov chain Monte Carlo methods.
no code implementations • NeurIPS 2012 • Bonnie Kirkpatrick, Alexandre Bouchard-Côté
Meanwhile, analysis methods have remained limited to pedigrees of fewer than 100 individuals, restricting analyses to collections of many small independent pedigrees.
no code implementations • NeurIPS 2012 • Seong-Hwan Jun, Liangliang Wang, Alexandre Bouchard-Côté
We propose a novel method for scalable parallelization of SMC algorithms, Entangled Monte Carlo simulation (EMC).
no code implementations • 18 Jun 2014 • Bobak Shahriari, Ziyu Wang, Matthew W. Hoffman, Alexandre Bouchard-Côté, Nando de Freitas
However, the performance of a Bayesian optimization method very much depends on its exploration strategy, i.e., the choice of acquisition function, and it is not clear a priori which choice will result in superior performance.
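The acquisition-function choice mentioned above can be made concrete with a small sketch. This is not code from the paper; it is a minimal implementation of one standard acquisition function, expected improvement, assuming the surrogate model's posterior mean `mu` and standard deviation `sigma` at a candidate point are already available.

```python
import math

def expected_improvement(mu, sigma, best, xi=0.01):
    """Expected improvement (for maximization) at one candidate point,
    given the surrogate's posterior mean `mu` and std `sigma` there and
    the incumbent best observed value `best`. `xi` is an exploration
    margin; these names are illustrative, not from the paper."""
    if sigma == 0.0:
        return 0.0  # no predictive uncertainty, no expected improvement
    z = (mu - best - xi) / sigma
    # standard normal pdf and cdf via math.erf
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (mu - best - xi) * cdf + sigma * pdf
```

A Bayesian optimization loop would evaluate this at many candidates and query the objective where the acquisition value is highest; other acquisition functions (probability of improvement, UCB, Thompson sampling) plug into the same loop, which is exactly why the choice among them matters.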
3 code implementations • 19 Jun 2014 • Fredrik Lindsten, Adam M. Johansen, Christian A. Naesseth, Bonnie Kirkpatrick, Thomas B. Schön, John Aston, Alexandre Bouchard-Côté
We propose a novel class of Sequential Monte Carlo (SMC) algorithms, appropriate for inference in probabilistic graphical models.
no code implementations • 14 Aug 2015 • Bobak Shahriari, Alexandre Bouchard-Côté, Nando de Freitas
Bayesian optimization has recently emerged as a popular and efficient tool for global optimization and hyperparameter tuning.
3 code implementations • 8 Oct 2015 • Alexandre Bouchard-Côté, Sebastian J. Vollmer, Arnaud Doucet
We explore and propose several original extensions of an alternative approach introduced recently in Peters and de With (2012) where the target distribution of interest is explored using a continuous-time Markov process.
Methodology Statistics Theory
4 code implementations • 16 Jan 2017 • Joris Bierkens, Alexandre Bouchard-Côté, Arnaud Doucet, Andrew B. Duncan, Paul Fearnhead, Thibaut Lienart, Gareth Roberts, Sebastian J. Vollmer
Piecewise Deterministic Monte Carlo algorithms enable simulation from a posterior distribution, whilst only needing to access a sub-sample of data at each iteration.
Methodology Computation
1 code implementation • 28 Jan 2019 • Robert Cornish, Paul Vanetti, Alexandre Bouchard-Côté, George Deligiannidis, Arnaud Doucet
Bayesian inference via standard Markov Chain Monte Carlo (MCMC) methods is too computationally intensive to handle large datasets, since the cost per step usually scales like $\Theta(n)$ in the number of data points $n$.
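The $\Theta(n)$ per-step cost can be seen in a toy random-walk Metropolis sketch. This is not the paper's method; it is a generic illustration, with a hypothetical Gaussian model and flat prior, showing that every accept/reject decision evaluates a log-likelihood that sums over all $n$ data points.

```python
import math
import random

def mh_step(theta, data, step=0.5):
    """One random-walk Metropolis step for a N(theta, 1) likelihood with
    a flat prior (an illustrative toy model). Each call evaluates the
    log-likelihood sum over every data point, so the per-step cost is
    Theta(n) in the dataset size n."""
    def log_lik(th):
        return sum(-0.5 * (x - th) ** 2 for x in data)  # n terms
    prop = theta + random.gauss(0.0, step)
    # Metropolis accept/reject on the log scale
    if math.log(random.random()) < log_lik(prop) - log_lik(theta):
        return prop
    return theta
```

Subsampling-based and delayed-acceptance schemes aim to replace the full sum with a cheaper surrogate at each step while preserving (exactly or approximately) the target distribution.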
no code implementations • 30 May 2019 • Tingting Zhao, Alexandre Bouchard-Côté
An important aspect of the practical implementation of BPS is the simulation of event times.
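One standard way to simulate such event times, when the event rate can be bounded from above, is Poisson thinning. The sketch below is a generic illustration, not the paper's algorithm: it assumes a constant upper bound `rate_bound` on the intensity, which in BPS practice is often only available locally or via problem-specific structure.

```python
import random

def thinning_event_time(rate, rate_bound, t_max=float("inf")):
    """First event time of an inhomogeneous Poisson process with
    intensity rate(t), simulated by thinning. Assumes
    rate(t) <= rate_bound for all t in [0, t_max]."""
    t = 0.0
    while t < t_max:
        # propose a candidate from the homogeneous bounding process
        t += random.expovariate(rate_bound)
        # accept the candidate with probability rate(t) / rate_bound
        if random.random() < rate(t) / rate_bound:
            return t
    return t_max
```

A tight bound makes thinning efficient (few rejected candidates); a loose bound still gives exact event times but wastes proposals, which is one reason event-time simulation is a practical bottleneck for BPS.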
1 code implementation • 22 Dec 2019 • Alexandre Bouchard-Côté, Kevin Chern, Davor Cubranic, Sahand Hosseini, Justin Hume, Matteo Lepur, Zihui Ouyang, Giorgio Sgarbi
Blang allows users to perform Bayesian analysis on arbitrary data types while using a declarative syntax similar to BUGS.
1 code implementation • 25 Jan 2020 • Alexandre Bouchard-Côté, Andrew Roth
To overcome this problem we have developed a Gibbs sampler that can update an entire row of the feature allocation matrix in a single move.
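A row-wise block update of this flavour can be sketched generically (this is not the paper's sampler): for a small number of features K, enumerate all 2^K configurations of a row and sample one in proportion to the joint density, instead of flipping entries one at a time. The `log_joint` function below is a placeholder for a concrete model's log-density.

```python
import itertools
import math
import random

def gibbs_row_update(Z, i, log_joint):
    """Block-Gibbs update of row i of a binary feature-allocation
    matrix Z (a list of lists): enumerate all 2^K configurations of
    the row and sample one in proportion to the joint density.
    `log_joint(Z)` is a placeholder for the model; feasible only for
    small K, since the enumeration costs 2^K evaluations."""
    K = len(Z[i])
    configs = list(itertools.product([0, 1], repeat=K))
    logw = []
    for c in configs:
        Z[i] = list(c)
        logw.append(log_joint(Z))
    m = max(logw)  # stabilize the exponentials
    w = [math.exp(l - m) for l in logw]
    # sample one configuration with probability proportional to w
    r = random.random() * sum(w)
    acc = 0.0
    for c, wi in zip(configs, w):
        acc += wi
        if r <= acc:
            Z[i] = list(c)
            return Z
    Z[i] = list(configs[-1])
    return Z
```

Updating a whole row jointly lets the chain cross low-probability intermediate configurations that single-entry flips would have to pass through one at a time.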
no code implementations • 24 Jun 2020 • Peiyuan Zhu, Alexandre Bouchard-Côté, Trevor Campbell
Completely random measures provide a principled approach to creating flexible unsupervised models, where the number of latent features is infinite and the number of features that influence the data grows with the size of the data set.
1 code implementation • 15 Feb 2021 • Saifuddin Syed, Vittorio Romaniello, Trevor Campbell, Alexandre Bouchard-Côté
Parallel tempering (PT) is a class of Markov chain Monte Carlo algorithms that construct a path of distributions annealing between a tractable reference and an intractable target, and then interchange states along the path to improve mixing in the target.
Computation 65C05
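The annealing-and-swap scheme described above can be sketched in a few lines. This is a generic illustration, not the paper's method: the reference is a standard normal, the target is a hypothetical two-mode Gaussian mixture, and the path is the standard geometric one, pi_beta ∝ pi_ref^(1-beta) * pi_target^beta.

```python
import math
import random

def tempered_logpdf(x, beta):
    """Log-density on the annealing path between a N(0, 1) reference
    (beta = 0) and an illustrative bimodal target with modes at +-3
    (beta = 1): pi_beta is proportional to pi_ref^(1-beta) * pi_target^beta."""
    log_ref = -0.5 * x * x
    log_target = math.log(0.5 * math.exp(-0.5 * (x - 3.0) ** 2)
                          + 0.5 * math.exp(-0.5 * (x + 3.0) ** 2))
    return (1.0 - beta) * log_ref + beta * log_target

def pt_sweep(states, betas, step=1.0):
    """One PT sweep: a random-walk Metropolis move for each chain at its
    own inverse temperature, then Metropolis-accepted swaps between
    neighbouring chains along the path."""
    for i, beta in enumerate(betas):
        prop = states[i] + random.gauss(0.0, step)
        if math.log(random.random()) < (tempered_logpdf(prop, beta)
                                        - tempered_logpdf(states[i], beta)):
            states[i] = prop
    for i in range(len(betas) - 1):
        a = (tempered_logpdf(states[i], betas[i + 1])
             + tempered_logpdf(states[i + 1], betas[i])
             - tempered_logpdf(states[i], betas[i])
             - tempered_logpdf(states[i + 1], betas[i + 1]))
        if math.log(random.random()) < a:
            states[i], states[i + 1] = states[i + 1], states[i]
    return states
```

The hot chains (beta near 0) move freely between the target's basins, and the swap moves carry those jumps down the path to the target chain, which is the mixing improvement the abstract refers to.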
no code implementations • 14 Feb 2024 • Alexandre Bouchard-Côté, Trevor Campbell, Geoff Pleiss, Nikola Surjanovic
This paper is intended to appear as a chapter for the Handbook of Markov Chain Monte Carlo.