1 code implementation • 23 Mar 2024 • Omar Melikechi, Jeffrey W. Miller
This yields a tighter bound on E(FP), the expected number of false positives, resulting in a feature selection criterion that is more sensitive in practice and better calibrated, in the sense that the realized E(FP) more closely matches the target.
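As background for what "matching the target E(FP)" means, here is a minimal sketch (not the authors' criterion) of calibrating a p-value threshold against a target expected number of false positives. It assumes null p-values are uniform on [0, 1]; the function name and the simple union-style bound E(FP) <= m*t are illustrative assumptions, and the paper's point is that a tighter bound than this gives a better-calibrated rule.

```python
import numpy as np

def select_by_efp_budget(pvals, efp_target):
    """Select features whose p-values fall below a threshold chosen so that
    a simple (conservative) bound on E(FP) stays at or below efp_target.

    Assuming null p-values are uniform on [0, 1], thresholding at t gives
    E(FP) <= m * t, so we take t = efp_target / m.  This is only a baseline
    illustration; the paper derives a tighter bound.
    """
    pvals = np.asarray(pvals)
    m = pvals.size
    t = efp_target / m                      # largest t with m * t <= efp_target
    selected = np.nonzero(pvals <= t)[0]
    return selected, t

# Toy usage: 1000 nulls plus 20 strong signals, budget of ~1 expected false positive.
rng = np.random.default_rng(0)
pvals = np.concatenate([rng.uniform(size=1000), rng.uniform(0, 1e-5, size=20)])
selected, t = select_by_efp_budget(pvals, efp_target=1.0)
print(f"threshold {t:.2e}, selected {selected.size} features")
```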
no code implementations • 3 Nov 2023 • Jonathan H. Huggins, Jeffrey W. Miller
Under model misspecification, it is known that Bayesian posteriors often do not properly quantify uncertainty about true or pseudo-true parameters.
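A toy illustration of this phenomenon (my own example, not taken from the paper): fit a unit-variance normal model to data whose true variance is larger. The posterior for the mean is then far too narrow, so credible intervals badly undercover the pseudo-true parameter.

```python
import numpy as np

# Toy illustration (not from the paper): fit a Normal(theta, 1) model with a flat
# prior to data that are actually Normal(0, 3^2).  The pseudo-true parameter is
# theta = 0, but the posterior variance 1/n reflects the assumed unit variance,
# so 95% credible intervals cover the pseudo-true value far less than 95% of the time.
rng = np.random.default_rng(1)
n, reps, true_sd = 50, 2000, 3.0
covered = 0
for _ in range(reps):
    x = rng.normal(0.0, true_sd, size=n)
    post_mean, post_sd = x.mean(), 1.0 / np.sqrt(n)   # posterior under the (wrong) unit-variance model
    lo, hi = post_mean - 1.96 * post_sd, post_mean + 1.96 * post_sd
    covered += (lo <= 0.0 <= hi)
print(f"credible-interval coverage of the pseudo-true parameter: {covered / reps:.2f}")  # well below 0.95
```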
no code implementations • 5 Feb 2018 • Jeffrey W. Miller
It turns out that the full conditional distribution of the gamma shape parameter is well approximated by a gamma distribution, even for small sample sizes, when the prior on the shape parameter is also a gamma distribution.
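One way to see how such a gamma approximation can be built (a sketch of the general idea, not necessarily the paper's exact construction): match the mode and curvature of the log full conditional. The model assumed here is x_i ~ Gamma(a, b) with known rate b and a Gamma(a0, b0) prior on the shape a; the function name and the Newton-based mode search are illustrative.

```python
import numpy as np
from scipy.special import digamma, polygamma

def gamma_approx_shape_conditional(x, b, a0, b0, a_init=1.0, iters=20):
    """Approximate p(a | x, b) with a Gamma(A, B) (shape/rate) by matching the
    mode and curvature of the log full conditional (a Laplace-style fit).

    Model assumed here: x_i ~ Gamma(a, b) with rate b known, prior a ~ Gamma(a0, b0).
    This sketches the idea; it is not necessarily the paper's exact construction.
    """
    x = np.asarray(x)
    n, S = x.size, np.log(x).sum()

    def dlogp(a):   # d/da of the log full conditional (up to a constant)
        return (a0 - 1) / a - b0 + n * np.log(b) - n * digamma(a) + S

    def d2logp(a):  # second derivative
        return -(a0 - 1) / a**2 - n * polygamma(1, a)

    a = a_init
    for _ in range(iters):                 # Newton iterations for the mode
        a = max(a - dlogp(a) / d2logp(a), 1e-8)
    A = 1.0 - a**2 * d2logp(a)             # match curvature at the mode
    B = (A - 1.0) / a                      # place the Gamma(A, B) mode at a
    return A, B

# Toy usage: data from Gamma(shape=2.5, rate=1.3), Gamma(2, 1) prior on the shape.
rng = np.random.default_rng(2)
x = rng.gamma(shape=2.5, scale=1 / 1.3, size=200)
A, B = gamma_approx_shape_conditional(x, b=1.3, a0=2.0, b0=1.0)
print(f"approximate full conditional: Gamma(shape={A:.1f}, rate={B:.1f}), mean {A / B:.2f}")
```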
no code implementations • 1 Jan 2018 • Jeffrey W. Miller
The Chinese restaurant process (CRP) and the stick-breaking process are the two most commonly used representations of the Dirichlet process.
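For reference, here is a minimal sketch of both representations (standard constructions, not code from the paper): the CRP draws a random partition sequentially, while stick-breaking draws the mixture weights directly.

```python
import numpy as np

def sample_crp(n, alpha, rng):
    """Draw a partition of n items from the Chinese restaurant process with
    concentration alpha: each item joins an existing table with probability
    proportional to its size, or a new table with probability proportional
    to alpha."""
    tables = []                         # tables[k] = number of customers at table k
    labels = np.empty(n, dtype=int)
    for i in range(n):
        probs = np.array(tables + [alpha], dtype=float)
        k = rng.choice(len(probs), p=probs / probs.sum())
        if k == len(tables):
            tables.append(0)            # open a new table
        tables[k] += 1
        labels[i] = k
    return labels, tables

def stick_breaking_weights(alpha, K, rng):
    """Truncated stick-breaking weights for the same Dirichlet process:
    V_k ~ Beta(1, alpha), w_k = V_k * prod_{j<k} (1 - V_j)."""
    v = rng.beta(1.0, alpha, size=K)
    return v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))

rng = np.random.default_rng(3)
labels, tables = sample_crp(n=100, alpha=1.0, rng=rng)
print(f"{len(tables)} occupied tables; sizes: {sorted(tables, reverse=True)}")
print(f"first stick-breaking weights: {stick_breaking_weights(1.0, 5, rng).round(3)}")
```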
1 code implementation • 22 Feb 2015 • Jeffrey W. Miller, Matthew T. Harrison
A natural Bayesian approach for mixture models with an unknown number of components is to take the usual finite mixture model with Dirichlet weights, and put a prior on the number of components---that is, to use a mixture of finite mixtures (MFM).
no code implementations • NeurIPS 2013 • Jeffrey W. Miller, Matthew T. Harrison
For data assumed to come from a finite mixture with an unknown number of components, it has become common to use Dirichlet process mixtures (DPMs) not only for density estimation, but also for inferences about the number of components.
no code implementations • 30 Aug 2013 • Jeffrey W. Miller, Matthew T. Harrison
We show that the posterior on the number of clusters is not consistent --- that is, on data from a finite mixture, it does not concentrate at the true number of components.
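A hedged sketch of the kind of simulation that makes this concrete (not the paper's experiments; the conjugate normal model with known variance, the collapsed Gibbs sampler in the style of Neal's Algorithm 3, and all parameter values below are assumptions for illustration): fit a DP mixture to data from a well-separated two-component mixture and track the posterior over the number of occupied clusters, which typically keeps non-negligible mass above 2.

```python
import numpy as np

def dpm_gibbs_num_clusters(x, alpha=1.0, sigma=1.0, mu0=0.0, tau0=3.0,
                           n_iters=300, seed=0):
    """Collapsed Gibbs sampler (Neal's Algorithm 3 style) for a DP mixture of
    Normal(mu_c, sigma^2) components with a Normal(mu0, tau0^2) prior on each mu_c.
    Returns the number of occupied clusters at each iteration."""
    rng = np.random.default_rng(seed)
    n = len(x)
    z = np.zeros(n, dtype=int)                    # start with everyone in one cluster
    counts, sums = {0: n}, {0: float(np.sum(x))}

    def log_pred(xi, m, s):
        # Posterior predictive of xi given m cluster members with sum s.
        var_post = 1.0 / (1.0 / tau0**2 + m / sigma**2)
        mean_post = var_post * (mu0 / tau0**2 + s / sigma**2)
        v = sigma**2 + var_post
        return -0.5 * np.log(2 * np.pi * v) - 0.5 * (xi - mean_post)**2 / v

    history = []
    for _ in range(n_iters):
        for i in range(n):
            c = z[i]
            counts[c] -= 1
            sums[c] -= x[i]
            if counts[c] == 0:
                del counts[c], sums[c]
            labels = list(counts.keys())
            logw = np.array(
                [np.log(counts[c2]) + log_pred(x[i], counts[c2], sums[c2]) for c2 in labels]
                + [np.log(alpha) + log_pred(x[i], 0, 0.0)]   # open a new cluster
            )
            w = np.exp(logw - logw.max())
            j = rng.choice(len(w), p=w / w.sum())
            if j == len(labels):
                c_new = max(counts, default=-1) + 1
                counts[c_new], sums[c_new] = 0, 0.0
            else:
                c_new = labels[j]
            counts[c_new] += 1
            sums[c_new] += x[i]
            z[i] = c_new
        history.append(len(counts))
    return np.array(history)

# Data from a clearly separated two-component mixture.
rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-4, 1, 250), rng.normal(4, 1, 250)])
t = dpm_gibbs_num_clusters(x)[100:]               # discard burn-in
vals, cnts = np.unique(t, return_counts=True)
print(dict(zip(vals.tolist(), (cnts / cnts.sum()).round(2).tolist())))
# Typically shows non-negligible posterior mass on more than 2 occupied clusters.
```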