On $f$-divergences between Cauchy distributions

29 Jan 2021 · Frank Nielsen, Kazuki Okamura

We prove that the $f$-divergences between univariate Cauchy distributions are all symmetric, and can be expressed as strictly increasing scalar functions of the symmetric chi-squared divergence. We report the corresponding scalar functions for the total variation distance, the Kullback-Leibler divergence, the squared Hellinger divergence, and the Jensen-Shannon divergence, among others. Next, we give conditions under which the $f$-divergences can be expanded as converging infinite series of higher-order power chi divergences, and illustrate this criterion for converging Taylor series expressing the $f$-divergences between Cauchy distributions. We then show that this symmetry property of $f$-divergences holds for multivariate location-scale families with prescribed matrix scales provided that the standard density is even, a condition which includes the multivariate normal and Cauchy families. However, the $f$-divergences between multivariate Cauchy densities with different scale matrices are shown to be asymmetric. Finally, we present several metrizations of $f$-divergences between univariate Cauchy distributions and further report geometric embedding properties of the Kullback-Leibler divergence.
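As a concrete illustration of the first result, here is a minimal numerical sketch (not from the paper) that checks the symmetry of the Kullback-Leibler divergence between two univariate Cauchy densities and compares it against the closed form $\mathrm{KL}=\log(1+\tfrac{1}{2}\chi^2)$, where $\chi^2(p_1,p_2)=\frac{(l_1-l_2)^2+(s_1-s_2)^2}{2 s_1 s_2}$ is the symmetric chi-squared divergence between Cauchy densities with location-scale parameters $(l_i,s_i)$. These closed-form expressions follow the results reported by the authors and are taken here as assumptions of the sketch; all function names are illustrative.

```python
import numpy as np
from scipy.integrate import quad

def cauchy_pdf(x, loc, scale):
    """Density of the Cauchy distribution with given location and scale."""
    return scale / (np.pi * (scale**2 + (x - loc)**2))

def kl_numerical(l1, s1, l2, s2):
    """KL(p1 : p2) computed by numerical integration over the real line."""
    integrand = lambda x: cauchy_pdf(x, l1, s1) * np.log(
        cauchy_pdf(x, l1, s1) / cauchy_pdf(x, l2, s2)
    )
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

def chi_squared(l1, s1, l2, s2):
    """Symmetric chi-squared divergence between Cauchy densities (assumed closed form)."""
    return ((l1 - l2)**2 + (s1 - s2)**2) / (2 * s1 * s2)

def kl_closed_form(l1, s1, l2, s2):
    """KL as a strictly increasing scalar function of the chi-squared divergence."""
    return np.log(1.0 + 0.5 * chi_squared(l1, s1, l2, s2))

l1, s1, l2, s2 = 0.0, 1.0, 2.0, 3.0
print(kl_numerical(l1, s1, l2, s2))    # forward KL
print(kl_numerical(l2, s2, l1, s1))    # reverse KL: equal, illustrating symmetry
print(kl_closed_form(l1, s1, l2, s2))  # matches both, per the reported closed form
```

For the parameters above, all three printed values agree (about 0.5108, i.e. $\log(5/3)$), illustrating both the symmetry of the KL divergence in the Cauchy family and its expression through the chi-squared divergence.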


Categories


Information Theory · Statistics Theory
