Gradient-based training of Gaussian Mixture Models in High-Dimensional Spaces

18 Dec 2019 · Alexander Gepperth, Benedikt Pfülb

We present an approach for efficiently training GMMs solely with Stochastic Gradient Descent (SGD) on huge amounts of non-stationary, high-dimensional data. In such scenarios, SGD is superior to the traditional Expectation-Maximization (EM) algorithm w.r.t. ...
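The abstract above is truncated, so the paper's exact procedure is not shown here. As a hedged illustration of the general idea it names, the sketch below trains a diagonal-covariance GMM purely by SGD on the negative log-likelihood, with no EM steps. All specifics (the softmax/log-sigma reparameterization that keeps weights and variances valid under unconstrained gradient steps, the toy 2-D data, learning rate, and batch size) are this sketch's assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two 2-D blobs (a hypothetical stand-in for high-dimensional data).
X = np.concatenate([rng.normal(-2.0, 0.5, size=(500, 2)),
                    rng.normal(+2.0, 0.5, size=(500, 2))])
K, D = 2, X.shape[1]

# Free parameters: softmax(logits) -> mixture weights, exp(log_sigma) -> stddevs,
# so plain gradient steps cannot produce invalid weights or negative variances.
logits = np.zeros(K)
mu = rng.normal(0.0, 1.0, size=(K, D))
log_sigma = np.zeros((K, D))

def nll_and_resp(batch, logits, mu, log_sigma):
    """Mean negative log-likelihood and responsibilities r[n, k]."""
    log_pi = logits - np.logaddexp.reduce(logits)          # log softmax
    sigma2 = np.exp(2.0 * log_sigma)                       # (K, D) variances
    diff = batch[:, None, :] - mu[None, :, :]              # (N, K, D)
    log_norm = -0.5 * (np.log(2.0 * np.pi) + 2.0 * log_sigma).sum(axis=1)
    log_comp = log_norm[None, :] - 0.5 * (diff**2 / sigma2[None]).sum(axis=2)
    joint = log_pi[None, :] + log_comp                     # (N, K)
    log_p = np.logaddexp.reduce(joint, axis=1)             # (N,)
    resp = np.exp(joint - log_p[:, None])                  # responsibilities
    return -log_p.mean(), resp, diff, sigma2

lr, batch_size = 0.05, 100
nll_init, _, _, _ = nll_and_resp(X, logits, mu, log_sigma)

for epoch in range(30):
    for i in rng.permutation(len(X)).reshape(-1, batch_size):
        batch = X[i]
        _, r, diff, sigma2 = nll_and_resp(batch, logits, mu, log_sigma)
        # Hand-derived gradients of the mean NLL (no autodiff needed here).
        pi = np.exp(logits - np.logaddexp.reduce(logits))
        g_logits = -(r.mean(axis=0) - pi)
        g_mu = -(r[:, :, None] * diff / sigma2[None]).mean(axis=0)
        g_ls = -(r[:, :, None] * (diff**2 / sigma2[None] - 1.0)).mean(axis=0)
        logits -= lr * g_logits
        mu -= lr * g_mu
        log_sigma -= lr * g_ls

nll_final, _, _, _ = nll_and_resp(X, logits, mu, log_sigma)
print(nll_init, nll_final)  # SGD should have reduced the negative log-likelihood
```

The reparameterization is the key design choice: EM enforces the simplex and positive-definiteness constraints implicitly, whereas unconstrained SGD needs them built into the parameterization.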




