Stochastic First-Order Learning for Large-Scale Flexibly Tied Gaussian Mixture Model

11 Dec 2022 · Mohammad Pasande, Reshad Hosseini, Babak Nadjar Araabi

Gaussian Mixture Models (GMMs) are among the most powerful parametric density models and are used extensively in many applications. Flexibly-tied factorization of the covariance matrices in GMMs is a powerful approach for coping with the challenges common GMMs face on high-dimensional data and complex densities, which often demand a large number of Gaussian components. However, the expectation-maximization (EM) algorithm for fitting flexibly-tied GMMs still struggles with streaming and very high-dimensional data. To overcome these challenges, this paper suggests the use of first-order stochastic optimization algorithms. Specifically, we propose a new stochastic optimization algorithm on the manifold of orthogonal matrices. Through extensive empirical results on both synthetic and real datasets, we observe that stochastic optimization methods can outperform the EM algorithm by attaining better likelihoods, requiring fewer epochs to converge, and consuming less time per epoch.
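The two ingredients of the abstract, a shared factor tying the component covariances together and stochastic gradient steps constrained to the manifold of orthogonal matrices, can be illustrated with a short sketch. The snippet below is a minimal, hypothetical illustration, not the paper's implementation: it assumes a flexibly-tied parameterization of the form Sigma_k = U diag(d_k) U^T with one orthogonal factor U shared across all K components (the paper's exact factorization may differ), and it uses a standard tangent-space projection followed by a QR retraction for the Riemannian SGD step. All function names here are assumptions for illustration.

```python
import numpy as np

def component_covariances(U, log_d):
    """Flexibly-tied covariances: Sigma_k = U diag(d_k) U^T (assumed form).

    U     : (p, p) orthogonal factor, shared by all K components
    log_d : (K, p) unconstrained; exp() keeps each diagonal positive
    """
    d = np.exp(log_d)  # (K, p) positive per-component scales
    # Sigma[k, i, l] = sum_j U[i, j] * d[k, j] * U[l, j]
    return np.einsum('ij,kj,lj->kil', U, d, U)

def riemannian_sgd_step(U, grad_U, lr):
    """One SGD step on the orthogonal manifold (generic sketch).

    Projects the Euclidean gradient onto the tangent space at U,
    then retracts with a QR decomposition so U stays orthogonal.
    """
    # Tangent projection: P_U(G) = G - U * sym(U^T G)
    sym = 0.5 * (U.T @ grad_U + grad_U.T @ U)
    rgrad = grad_U - U @ sym
    # QR retraction of the Euclidean step back onto the manifold
    Q, R = np.linalg.qr(U - lr * rgrad)
    # Sign correction makes the QR factor unique (positive diag(R))
    return Q * np.sign(np.diag(R))

# Toy usage: p = 4 dimensions, K = 3 components
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
log_d = rng.standard_normal((3, 4))
Sigmas = component_covariances(U, log_d)  # (3, 4, 4), all sharing U
U = riemannian_sgd_step(U, rng.standard_normal((4, 4)), lr=1e-2)
assert np.allclose(U.T @ U, np.eye(4), atol=1e-10)
```

Under this assumed parameterization, tying U across components cuts the covariance parameter count from O(K p^2) for full covariances to O(p^2 + K p), while the projection-plus-retraction step keeps every iterate exactly orthogonal, which is what allows plain first-order stochastic updates to respect the manifold constraint.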
