A unified framework for hard and soft clustering with regularized optimal transport

In this paper, we formulate the problem of inferring a Finite Mixture Model from discrete data as an optimal transport problem with entropic regularization of parameter $\lambda\geq 0$. Our method unifies hard and soft clustering, the Expectation-Maximization (EM) algorithm being exactly recovered for $\lambda=1$. The family of clustering algorithms we propose relies on the resolution of nonconvex problems by alternating minimization. We study the convergence properties of our generalized $\lambda$-EM algorithms and show that each step of the minimization admits a closed-form solution when inferring finite mixture models of exponential families. Experiments highlight the benefits of taking $\lambda>1$ to improve inference performance and of letting $\lambda\to 0$ for classification.
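The sketch below is not the paper's exact formulation, but a minimal illustration of the kind of $\lambda$-EM loop the abstract describes, under the assumption that the regularization parameter tempers the E-step responsibilities: $\lambda=1$ recovers standard EM posteriors, $\lambda\to 0$ collapses them to hard assignments, and the M-step keeps its closed form for an exponential-family mixture (here a 1-D Gaussian mixture). The function name `lambda_em_gmm` and all implementation details are illustrative assumptions.

```python
import numpy as np

def lambda_em_gmm(x, k, lam=1.0, n_iter=100, seed=None):
    """Illustrative lambda-EM loop for a 1-D Gaussian mixture (not the paper's exact algorithm).

    lam = 1  -> standard EM responsibilities (soft clustering)
    lam -> 0 -> responsibilities collapse to hard assignments
    lam > 1  -> flatter, more entropic responsibilities
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    # Simple random initialization of weights, means, and variances.
    pi = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, np.var(x))

    for _ in range(n_iter):
        # "E-step": per-component log joint densities, shape (n, k).
        log_p = (
            np.log(pi)
            - 0.5 * np.log(2 * np.pi * var)
            - 0.5 * (x[:, None] - mu) ** 2 / var
        )
        if lam > 1e-8:
            # Tempered posteriors: proportional to p(z, x)^(1/lam), renormalized.
            log_r = log_p / lam
            log_r -= log_r.max(axis=1, keepdims=True)
            r = np.exp(log_r)
            r /= r.sum(axis=1, keepdims=True)
        else:
            # lam -> 0 limit: hard (one-hot) assignments.
            r = np.eye(k)[np.argmax(log_p, axis=1)]

        # "M-step": closed-form updates, as for an exponential family.
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-12

    return pi, mu, var

# Toy usage: two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(3, 1.0, 200)])
print(lambda_em_gmm(x, k=2, lam=1.0, seed=0))  # soft clustering (EM)
print(lambda_em_gmm(x, k=2, lam=0.0, seed=0))  # hard clustering limit
```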
