On Hölder projective divergences

14 Jan 2017  ·  Frank Nielsen, Ke Sun, Stéphane Marchand-Maillet

We describe a framework to build distances by measuring the tightness of inequalities, and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the Hölder ordinary and reverse inequalities, and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the special case of the Cauchy-Schwarz divergence. We report closed-form formulas for those statistical dissimilarities when considering distributions belonging to the same exponential family, provided that the natural parameter space is a cone (e.g., multivariate Gaussians) or affine (e.g., categorical distributions). Those new classes of Hölder distances are invariant to rescaling, and thus do not require distributions to be normalized. Finally, we show how to compute statistical Hölder centroids with respect to those divergences, and carry out toy center-based clustering experiments on a set of Gaussian distributions, demonstrating empirically that symmetrized Hölder divergences outperform the symmetric Cauchy-Schwarz divergence.
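For concreteness, here is a minimal numerical sketch of a two-parameter Hölder divergence on discrete positive arrays, assuming the projective form D_{α,γ}(p:q) = −log( Σ_x p(x)^{γ/α} q(x)^{γ/β} / ((Σ_x p(x)^γ)^{1/α} (Σ_x q(x)^γ)^{1/β}) ) with conjugate exponents satisfying 1/α + 1/β = 1. The function name and parameterization are illustrative, not the paper's reference implementation. Setting α = β = γ = 2 recovers the Cauchy-Schwarz divergence, and rescaling either argument leaves the value unchanged, which illustrates the scale invariance noted in the abstract.

```python
import numpy as np

def holder_divergence(p, q, alpha=2.0, gamma=2.0):
    """Hypothetical sketch of a two-parameter Holder divergence on
    discrete positive arrays, under the assumed projective form:

        -log( sum(p^(g/a) * q^(g/b))
              / (sum(p^g)^(1/a) * sum(q^g)^(1/b)) ),  1/a + 1/b = 1.

    alpha = gamma = 2 recovers the Cauchy-Schwarz divergence.
    """
    beta = alpha / (alpha - 1.0)  # Holder conjugate: 1/alpha + 1/beta = 1
    num = np.sum(p ** (gamma / alpha) * q ** (gamma / beta))
    den = (np.sum(p ** gamma) ** (1.0 / alpha)
           * np.sum(q ** gamma) ** (1.0 / beta))
    return -np.log(num / den)

rng = np.random.default_rng(0)
p, q = rng.random(5), rng.random(5)         # unnormalized positive measures
print(holder_divergence(p, q))              # nonnegative, by Holder's inequality
print(holder_divergence(3.0 * p, 0.5 * q))  # same value: invariant to rescaling
print(holder_divergence(p, p))              # ~0: self-divergence vanishes
```

By Hölder's inequality the numerator never exceeds the denominator, so the value is nonnegative, and it vanishes exactly when p and q are proportional, consistent with a proper divergence on unnormalized distributions.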
