no code implementations • 17 Aug 2019 • Creighton Heaukulani, Daniel M. Roy
We develop constructions for exchangeable sequences of point processes that are rendered conditionally i.i.d.
1 code implementation • NeurIPS 2019 • Creighton Heaukulani, Mark van der Wilk
We implement gradient-based variational inference routines for Wishart and inverse Wishart processes, which we apply as Bayesian models for the dynamic, heteroskedastic covariance matrix of a multivariate time series.
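As a minimal sketch of the construction underlying such models (not the paper's variational inference routine): a Wishart process can be built by summing outer products of independent Gaussian-process paths, which yields a positive-definite covariance matrix at every time point. The kernel, dimensions, and identity scale matrix below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(ts, lengthscale=1.0):
    """Squared-exponential kernel matrix over the time grid."""
    d = ts[:, None] - ts[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

T, D, NU = 50, 3, 5  # time points, series dimension, degrees of freedom (NU >= D)
ts = np.linspace(0.0, 10.0, T)
K = rbf_kernel(ts) + 1e-8 * np.eye(T)  # jitter for numerical stability
L = np.linalg.cholesky(K)

# Draw NU * D independent GP paths; F[t] is a (NU, D) matrix at each time t.
F = np.einsum("ts,snd->tnd", L, rng.standard_normal((T, NU, D)))

A = np.eye(D)  # (Cholesky factor of the) Wishart scale matrix; identity here
FA = F @ A.T
# Sigma(t) = sum_n (A f_n(t)) (A f_n(t))^T is positive definite when NU >= D.
Sigmas = np.einsum("tnd,tne->tde", FA, FA)
```

Because each Sigma(t) is a sum of NU outer products of smoothly varying GP values, the covariance evolves continuously in time, which is the property exploited when modeling dynamic, heteroskedastic covariance.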
no code implementations • 11 May 2019 • Onno Kampman, Creighton Heaukulani
We consider the probabilistic analogue to neural network matrix factorization (Dziugaite & Roy, 2015), which we construct with Bayesian neural networks and fit with variational inference.
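A sketch of the underlying (non-Bayesian) neural network matrix factorization idea: each matrix entry is predicted by a multilayer perceptron applied to learned row and column latent vectors. The snippet fixes random latents and weights; in the probabilistic analogue these quantities would carry priors and be fit with variational inference. All sizes and the single-hidden-layer architecture are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(1)

N_USERS, N_ITEMS, D, H = 8, 10, 4, 16

# Latent features for rows (users) and columns (items); in the Bayesian
# treatment these, and the network weights, would be random variables.
U = rng.normal(size=(N_USERS, D))
V = rng.normal(size=(N_ITEMS, D))

W1 = rng.normal(size=(2 * D, H)) * 0.1
b1 = np.zeros(H)
W2 = rng.normal(size=(H, 1)) * 0.1
b2 = np.zeros(1)

def predict(i, j):
    """Predicted entry X[i, j]: an MLP applied to the concatenated latents."""
    z = np.concatenate([U[i], V[j]])
    h = np.tanh(z @ W1 + b1)
    return float(h @ W2 + b2)

x_hat = predict(0, 3)
```

Replacing the bilinear form u_i^T v_j of classical matrix factorization with an MLP lets the model capture non-linear interactions between the row and column features.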
no code implementations • ICML 2017 • Juho Lee, Creighton Heaukulani, Zoubin Ghahramani, Lancelot F. James, Seungjin Choi
The BFRY random variables are well approximated by gamma random variables in a variational Bayesian inference routine, which we apply to several network datasets for which power law degree distributions are a natural assumption.
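For concreteness, BFRY random variables admit a simple sampling representation (assumed here from the general literature, not quoted from this abstract): X = G · U^(−1/σ) with G ~ Gamma(1 − σ, 1) and U ~ Uniform(0, 1), for 0 < σ < 1. This produces the power-law tail (density ∝ x^(−1−σ) for large x) that makes these variables natural for power-law degree distributions.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_bfry(sigma, size, rng):
    """Sample BFRY(sigma) via X = G * U**(-1/sigma) (assumed representation),
    with G ~ Gamma(1 - sigma, 1) and U ~ Uniform(0, 1); requires 0 < sigma < 1."""
    g = rng.gamma(shape=1.0 - sigma, scale=1.0, size=size)
    u = rng.uniform(size=size)
    return g * u ** (-1.0 / sigma)

sigma = 0.25
x = sample_bfry(sigma, 100_000, rng)
```

Note the heavy tail: for σ < 1 the mean of X is infinite, which is exactly the regime where a moment-matched gamma surrogate must be used with care inside a variational routine.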
no code implementations • 8 Dec 2015 • Creighton Heaukulani, Daniel M. Roy
We investigate a class of feature allocation models that generalize the Indian buffet process and are parameterized by Gibbs-type random measures.
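The Indian buffet process being generalized here has a standard sequential ("culinary") construction, sketched below: customer i takes each previously sampled dish k with probability m_k / i (m_k = number of previous takers) and then samples Poisson(α / i) new dishes. The values of n, α, and the seed are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_ibp(n, alpha, rng):
    """Draw a binary feature-allocation matrix Z from the Indian buffet process."""
    dish_counts = []  # m_k: how many customers have taken dish k so far
    rows = []
    for i in range(1, n + 1):
        # Take each existing dish with probability m_k / i.
        row = [int(rng.random() < m / i) for m in dish_counts]
        for k, z in enumerate(row):
            dish_counts[k] += z
        # Try a Poisson(alpha / i) number of brand-new dishes.
        new = rng.poisson(alpha / i)
        row.extend([1] * new)
        dish_counts.extend([1] * new)
        rows.append(row)
    Z = np.zeros((n, len(dish_counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, : len(row)] = row
    return Z

Z = sample_ibp(n=10, alpha=2.0, rng=rng)
```

Each row of Z is one object's set of features, and columns are features shared across objects, so Z encodes a feature allocation; the Gibbs-type generalizations in the paper replace the underlying beta process with other random measures while keeping this combinatorial picture.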
no code implementations • 14 Aug 2014 • Creighton Heaukulani, David A. Knowles, Zoubin Ghahramani
We define the beta diffusion tree, a random tree structure with a set of leaves that defines a collection of overlapping subsets of objects, known as a feature allocation.
no code implementations • 31 Dec 2013 • Creighton Heaukulani, Daniel M. Roy
…sequences of Bernoulli processes with a common beta process base measure, in which case the combinatorial structure is described by the Indian buffet process.