On $w$-mixtures: Finite convex combinations of prescribed component distributions

2 Aug 2017  ·  Frank Nielsen, Richard Nock

We consider the space of $w$-mixtures, defined as the set of finite statistical mixtures sharing the same prescribed component distributions and closed under convex combinations. The information geometry induced by the Bregman generator set to the Shannon negentropy on this space yields a dually flat space called the mixture family manifold. We show that the Kullback-Leibler (KL) divergence can be recovered from the corresponding Bregman divergence for the negentropy generator: the KL divergence between two $w$-mixtures amounts to a Bregman divergence (BD) induced by the Shannon negentropy generator. In particular, the KL divergence between two Gaussian Mixture Models (GMMs) sharing the same Gaussian components is equivalent to a Bregman divergence. This KL-BD equivalence on a mixture family manifold implies that we can perform optimal KL-averaging aggregation of $w$-mixtures without information loss. More generally, we prove that the statistical skew Jensen-Shannon divergence between $w$-mixtures is equivalent to a skew Jensen divergence between their corresponding parameters. Finally, we state several properties, divergence identities, and inequalities relating to $w$-mixtures.
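
To make the KL-BD identity concrete: writing $m_\theta = (1-\theta)\,p_0 + \theta\,p_1$ for two fixed component densities $p_0, p_1$, and $F(\theta) = \int m_\theta(x) \log m_\theta(x)\,\mathrm{d}x$ for the Shannon negentropy, the stated equivalence reads $\mathrm{KL}(m_\theta : m_{\theta'}) = B_F(\theta : \theta') = F(\theta) - F(\theta') - (\theta - \theta')F'(\theta')$. The sketch below checks this numerically for a pair of Gaussian components; it is an illustrative example only, and the component choices, integration bounds, and helper names (`mixture`, `negentropy`, `bregman`) are our assumptions, not code from the paper.

```python
# Minimal numerical sketch (not from the paper) of the KL = Bregman-divergence
# identity for w-mixtures of two fixed Gaussian components.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Two prescribed component densities p0, p1 (Gaussians chosen for illustration).
p0 = norm(loc=-1.0, scale=1.0).pdf
p1 = norm(loc=2.0, scale=1.5).pdf

def mixture(theta):
    """w-mixture m_theta(x) = (1 - theta) p0(x) + theta p1(x), theta in (0, 1)."""
    return lambda x: (1.0 - theta) * p0(x) + theta * p1(x)

def negentropy(theta, lo=-15.0, hi=15.0):
    """Shannon negentropy F(theta) = integral of m_theta log m_theta dx."""
    m = mixture(theta)
    return quad(lambda x: m(x) * np.log(m(x)), lo, hi)[0]

def negentropy_grad(theta, lo=-15.0, hi=15.0):
    """F'(theta) = integral of (p1 - p0) log m_theta dx."""
    m = mixture(theta)
    return quad(lambda x: (p1(x) - p0(x)) * np.log(m(x)), lo, hi)[0]

def kl(theta_a, theta_b, lo=-15.0, hi=15.0):
    """KL(m_a : m_b) computed directly by numerical quadrature."""
    ma, mb = mixture(theta_a), mixture(theta_b)
    return quad(lambda x: ma(x) * np.log(ma(x) / mb(x)), lo, hi)[0]

def bregman(theta_a, theta_b):
    """Bregman divergence B_F(theta_a : theta_b) for the negentropy generator."""
    return (negentropy(theta_a) - negentropy(theta_b)
            - (theta_a - theta_b) * negentropy_grad(theta_b))

if __name__ == "__main__":
    a, b = 0.3, 0.7
    print("KL     :", kl(a, b))       # the two values should agree
    print("Bregman:", bregman(a, b))  # up to quadrature error
```

Both printed values should agree up to quadrature error, illustrating that the density-level KL between two $w$-mixtures collapses to a Bregman computation on the weight parameter alone.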

