Exponential Concentration of a Density Functional Estimator

NeurIPS 2014 · Shashank Singh, Barnabás Póczos

We analyze a plug-in estimator for a large class of integral functionals of one or more continuous probability densities. This class includes important families of entropy, divergence, mutual information, and their conditional versions. For densities on the $d$-dimensional unit cube $[0,1]^d$ that lie in a $\beta$-Hölder smoothness class, we prove our estimator converges at the rate $O \left( n^{-\frac{\beta}{\beta + d}} \right)$. Furthermore, we prove the estimator is exponentially concentrated about its mean, whereas most previous related results have proven only expected error bounds on estimators.
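
To make the plug-in construction concrete, the following is a minimal Python sketch of a generic plug-in estimator for one such functional, the Shannon entropy $-\int_{[0,1]^d} p(x) \log p(x)\,dx$: fit a density estimate from the samples, then substitute it into the functional. The helper `plug_in_entropy`, the Gaussian kernel, the bandwidth value, and the Monte Carlo quadrature are illustrative assumptions; the paper's actual estimator uses a mirrored kernel density estimate with additional bias analysis at the boundary, which this sketch omits.

```python
# Minimal sketch of a plug-in estimator for the Shannon entropy
# H(p) = -\int_{[0,1]^d} p(x) log p(x) dx.
# The kernel, bandwidth, and quadrature below are illustrative choices,
# not the paper's mirrored, boundary-corrected construction.

import numpy as np
from sklearn.neighbors import KernelDensity


def plug_in_entropy(samples, bandwidth=0.1, n_quad=100_000, seed=0):
    """Plug-in entropy estimate: fit a density estimate, then integrate
    the functional by Monte Carlo quadrature over the unit cube [0,1]^d."""
    samples = np.asarray(samples)
    n, d = samples.shape

    # Step 1: density estimate \hat{p} from the samples.
    # A plain Gaussian kernel leaks mass outside [0,1]^d; the paper
    # instead mirrors the data at the boundary.
    kde = KernelDensity(kernel="gaussian", bandwidth=bandwidth).fit(samples)

    # Step 2: plug \hat{p} into the functional. Since [0,1]^d has unit
    # volume, uniform points give an unbiased quadrature rule for
    # \int \hat{p}(x) log \hat{p}(x) dx.
    rng = np.random.default_rng(seed)
    grid = rng.uniform(size=(n_quad, d))
    log_p = kde.score_samples(grid)   # log \hat{p} at the quadrature points
    p = np.exp(log_p)
    return -np.mean(p * log_p)        # estimate of -\int p log p


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Samples from a Beta(2, 5) density on [0, 1];
    # true differential entropy is about -0.48 nats.
    x = rng.beta(2.0, 5.0, size=(2000, 1))
    print(plug_in_entropy(x))
```

The same skeleton extends to the other functionals covered by the paper (divergences, mutual informations, and their conditional versions) by fitting one density estimate per argument and substituting them into the corresponding integral.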
