Mixture model fitting using conditional models and modal Gibbs sampling

27 Dec 2017  ·  Virgilio Gomez-Rubio

In this paper, we present a novel approach to fitting mixture models based on first estimating the posterior distribution of the auxiliary variables that assign each observation to a group in the mixture. The posterior distributions of the remaining parameters in the mixture are then obtained by averaging their posterior marginals, conditional on the auxiliary variables, using Bayesian model averaging. A new algorithm based on Gibbs sampling approximates the posterior distribution of the auxiliary variables without sampling any other parameter in the model. In particular, the modes of the full conditionals of the parameters of the densities in the mixture are computed and plugged into the full conditional of the auxiliary variables to draw samples. This approximation, which we have called 'modal' Gibbs sampling, reduces the computational burden of the Gibbs sampling algorithm while still providing very good estimates of the posterior distribution of the auxiliary variables. Models conditional on the auxiliary variables are fitted with the Integrated Nested Laplace Approximation (INLA) to obtain the conditional posterior distributions, including the modes, of the distributional parameters in the mixture. Because conditional model fitting is done with INLA, the approach is general enough to handle mixture models with discrete or continuous outcomes from a wide range of distributions and latent models. It also offers several other advantages: fast fitting of the conditional models, no restriction to conjugate priors on the model parameters, and less susceptibility to label switching. Within this framework, computing the marginal likelihood of the mixture model for a given number of groups is straightforward, and it can be used to tackle selection of the number of components.
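The core of the approach is a plug-in approximation: at each Gibbs iteration, instead of sampling the component parameters, their conditional posterior modes (obtained from the INLA fits) are plugged into the full conditional of the auxiliary variables z, and only z is sampled. The resulting samples approximate π(z | y), and the posterior marginals of the remaining parameters θ then follow by Bayesian model averaging, π(θ | y) ≈ Σ_z π(θ | y, z) π(z | y). The sketch below illustrates the structure of such a sampler in Python for a simple Gaussian mixture. As an illustrative simplification, the conditional modes are taken in closed form (group sample means and standard deviations, with fixed weights) rather than from INLA fits, so this is a minimal sketch of the sampler's shape, not the paper's implementation.

    # Illustrative sketch of 'modal' Gibbs sampling for a K-component Gaussian
    # mixture. Hypothetical simplification: conditional posterior modes of the
    # component parameters are computed in closed form; the paper obtains them
    # by fitting models conditional on z with R-INLA.
    import numpy as np

    rng = np.random.default_rng(0)

    def modal_gibbs(y, K, n_iter=500, weights=None):
        n = len(y)
        w = np.full(K, 1.0 / K) if weights is None else np.asarray(weights)
        z = rng.integers(K, size=n)               # initial group assignments
        z_samples = np.empty((n_iter, n), dtype=int)
        for it in range(n_iter):
            # Step 1: conditional posterior modes of the component parameters
            # given z (stand-in for the INLA conditional fits).
            mu = np.array([y[z == k].mean() if np.any(z == k) else y.mean()
                           for k in range(K)])
            sd = np.array([y[z == k].std() if np.sum(z == k) > 1 else y.std()
                           for k in range(K)])
            sd = np.maximum(sd, 1e-6)             # guard against empty groups
            # Step 2: plug the modes into the full conditional of z and sample.
            log_p = (np.log(w)[None, :]
                     - np.log(sd)[None, :]
                     - 0.5 * ((y[:, None] - mu[None, :]) / sd[None, :]) ** 2)
            p = np.exp(log_p - log_p.max(axis=1, keepdims=True))
            p /= p.sum(axis=1, keepdims=True)
            z = np.array([rng.choice(K, p=p_i) for p_i in p])
            z_samples[it] = z
        return z_samples

    # Toy usage: two well-separated Gaussian groups.
    y = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(5.0, 1.0, 100)])
    zs = modal_gibbs(y, K=2, n_iter=200)
    # Empirical frequency of assignment to group 1 for the first observations,
    # discarding the first 100 iterations as burn-in.
    print(zs[100:].mean(axis=0)[:5])

In the actual method, step 1 would be replaced by fitting the model conditional on z with R-INLA and extracting the posterior modes of the distributional parameters; the same conditional fits also yield the conditional posterior marginals that are averaged over the sampled values of z to recover the full posterior.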

