Cramér-Rao Lower Bounds Arising from Generalized Csiszár Divergences

14 Jan 2020  ·  M. Ashok Kumar, Kumar Vijay Mishra

We study the geometry of probability distributions with respect to a generalized family of Csiszár $f$-divergences. A member of this family is the relative $\alpha$-entropy, which is a Rényi analog of relative entropy in information theory and is known as the logarithmic or projective power divergence in statistics. We apply Eguchi's theory to derive the Fisher information metric and the dual affine connections arising from these generalized divergence functions. This enables us to arrive at a more widely applicable version of the Cramér-Rao inequality, which provides a lower bound for the variance of an estimator of an escort of the underlying parametric probability distribution. We then extend Amari and Nagaoka's dually flat structure of the exponential and mixture models to other distributions with respect to the aforementioned generalized metric. We show that these formulations lead to unbiased and efficient estimators for the escort model. Finally, we compare our work with prior results on generalized Cramér-Rao inequalities that were derived from non-information-geometric frameworks.
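
For orientation, the sketch below (not taken from the paper) illustrates two textbook ingredients the abstract relies on: the $\alpha$-escort of a distribution, $P^{(\alpha)}(x) \propto p(x)^{\alpha}$, and the ordinary Cramér-Rao lower bound $\mathrm{Var}(T) \geq 1/(n I(\theta))$, checked numerically for a Bernoulli model. It is a minimal illustration of these standard notions only; the paper's generalized inequality, which bounds the variance of an estimator of the escort model, is not reproduced here.

```python
import numpy as np

def escort(p, alpha):
    """Alpha-escort of a discrete distribution p: p(x)^alpha / sum_y p(y)^alpha."""
    w = np.asarray(p, dtype=float) ** alpha
    return w / w.sum()

# Classical Fisher information for one Bernoulli(theta) observation is
# I(theta) = 1 / (theta * (1 - theta)), so the Cramer-Rao bound for an
# unbiased estimator based on n i.i.d. samples is theta * (1 - theta) / n.
def bernoulli_crlb(theta, n):
    return theta * (1.0 - theta) / n

rng = np.random.default_rng(0)
theta, n, trials = 0.3, 200, 5000

# The sample mean is the unbiased MLE of theta; its variance attains the bound.
estimates = rng.binomial(n, theta, size=trials) / n
print("empirical variance of MLE:", estimates.var())
print("Cramer-Rao lower bound   :", bernoulli_crlb(theta, n))

# Escort of a Bernoulli(0.3) pmf for a few values of alpha (alpha = 1 recovers
# the original distribution); the escort model is the object constrained by the
# paper's generalized bound.
for a in (0.5, 1.0, 2.0):
    print(f"alpha={a}: escort =", escort([1 - theta, theta], a))
```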
