Rapid parametric density estimation

7 Feb 2017 · Jarek Duda

Parametric density estimation, for example fitting a Gaussian distribution, is a cornerstone of statistics. Machine learning requires inexpensive estimation of much more complex densities, yet the standard approach, maximum likelihood estimation (MLE), is relatively costly. This paper discusses inexpensive density estimation: for example, literally fitting a polynomial (or Fourier series) to the sample, whose coefficients are obtained by simply averaging monomials (or sines/cosines) over the sample. Another basic application discussed is fitting a distortion of a standard distribution such as the Gaussian, analogously to ICA, but additionally allowing reconstruction of the distorted density. Finally, by using weighted averages, the method can also be applied to estimate non-probabilistic densities, such as modelling a mass distribution, or to various clustering problems by using negative (or complex) weights: fitting a function whose sign (or argument) determines the clusters. The estimated parameters approach their optimal values with error dropping like $1/\sqrt{n}$, where $n$ is the sample size.
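
To make the polynomial-fitting recipe concrete, here is a minimal Python sketch under assumptions not stated in the abstract: the sample is supported on $[0,1]$ and the basis is the orthonormal shifted Legendre polynomials up to an illustrative degree. The coefficients are literally sample averages of the basis functions, as the abstract describes; the Beta-distributed test sample and the chosen degree are hypothetical choices for illustration, not taken from the paper's experiments.

```python
import numpy as np
from numpy.polynomial.legendre import legval

def legendre_basis(x, degree):
    """Orthonormal shifted Legendre basis on [0, 1], evaluated at points x.

    f_k(x) = sqrt(2k + 1) * P_k(2x - 1), so that int_0^1 f_j f_k dx = delta_{jk}.
    Returns an array of shape (degree + 1, len(x)).
    """
    t = 2.0 * np.asarray(x) - 1.0            # map [0, 1] -> [-1, 1]
    rows = []
    for k in range(degree + 1):
        coeffs = np.zeros(k + 1)
        coeffs[k] = 1.0                       # select the k-th Legendre polynomial
        rows.append(np.sqrt(2 * k + 1) * legval(t, coeffs))
    return np.vstack(rows)

def fit_density(sample, degree=6):
    """Density estimate on [0, 1]: each coefficient is just the average of the
    corresponding basis function over the sample, a_k = mean_i f_k(x_i)."""
    a = legendre_basis(sample, degree).mean(axis=1)
    def rho(x):
        return a @ legendre_basis(x, degree)  # rho(x) = sum_k a_k f_k(x)
    return a, rho

# Usage (hypothetical example): recover the density of a Beta(2, 5) sample.
rng = np.random.default_rng(0)
sample = rng.beta(2.0, 5.0, size=10_000)
a, rho = fit_density(sample, degree=8)
xs = np.linspace(0.0, 1.0, 5)
print(np.round(rho(xs), 3))                   # estimated density at a few points
```

Each coefficient is a plain sample mean, so by the central limit theorem its error shrinks like $1/\sqrt{n}$, matching the rate quoted in the abstract; the fitted polynomial may dip slightly below zero, which is a known trade-off of this kind of orthogonal-series estimator.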

PDF Abstract

Datasets



Results from the Paper



Methods