Improving ClusterGAN Using Self-Augmented Information Maximization of Disentangling Latent Spaces

27 Jul 2021 · Tanmoy Dam, Sreenatha G. Anavatti, Hussein A. Abbass

Conditional generative models have achieved remarkable results since their introduction a few years ago, but they typically require large amounts of labelled data. By combining unsupervised conditional generation with a clustering inference network, ClusterGAN has recently achieved impressive clustering results. However, because ClusterGAN draws the discrete part of its latent code from a uniform prior, it ignores the true conditional distribution of the data, which is not necessarily balanced. As a consequence, ClusterGAN can fail to generate all modes, which degrades the performance of the clustering inference network. It is therefore important to learn a prior that matches the real data distribution in an unsupervised way. In this paper, we propose Self-Augmented Information Maximization improved ClusterGAN (SIMI-ClusterGAN), which learns distinctive priors directly from the data. SIMI-ClusterGAN consists of four deep neural networks: a self-augmentation prior network, a generator, a discriminator and a clustering inference network. The proposed method has been validated on seven benchmark data sets and shows improved performance over state-of-the-art methods. To demonstrate its advantages on imbalanced data, we also evaluate two imbalanced settings on MNIST: a one-class imbalance and a three-class imbalance. The results highlight the advantages of SIMI-ClusterGAN.
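The core argument above is that sampling the discrete latent code from a uniform prior mismatches imbalanced data, whereas a prior learned from the data does not. The sketch below illustrates that gap with NumPy. All specifics here are illustrative assumptions: the class proportions, the latent dimensions, and the use of a simple empirical frequency estimate as a stand-in for the paper's self-augmentation prior network (which is a trained neural network, not a frequency count).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical imbalanced "MNIST-like" setup: class 0 is under-represented
# (loosely mirroring the paper's one-class-imbalance experiment; the exact
# proportions here are illustrative, not taken from the paper).
true_prior = np.array([0.01] + [0.11] * 9)

n = 100_000
labels = rng.choice(10, size=n, p=true_prior)

# ClusterGAN: the discrete part of the latent code is sampled uniformly.
uniform_prior = np.full(10, 0.1)

# SIMI-ClusterGAN learns the prior from the data; as a crude stand-in for
# the self-augmentation prior network, we use empirical class frequencies.
learned_prior = np.bincount(labels, minlength=10) / n

def sample_latent(prior, dim_zn=30, size=1):
    """Latent code z = (z_n, z_c): continuous noise plus a one-hot class part."""
    c = rng.choice(10, size=size, p=prior)
    z_n = rng.normal(scale=0.1, size=(size, dim_zn))
    z_c = np.eye(10)[c]
    return np.concatenate([z_n, z_c], axis=1), c

_, c_uniform = sample_latent(uniform_prior, size=n)
_, c_learned = sample_latent(learned_prior, size=n)

def tv_distance(c, target):
    """Total-variation distance between sampled class frequencies and target."""
    emp = np.bincount(c, minlength=10) / len(c)
    return 0.5 * np.abs(emp - target).sum()

tv_uniform = tv_distance(c_uniform, true_prior)
tv_learned = tv_distance(c_learned, true_prior)
print(f"uniform prior TV: {tv_uniform:.4f}, learned prior TV: {tv_learned:.4f}")
```

Under this toy setup, the learned prior's sampled class frequencies stay close to the imbalanced data distribution, while the uniform prior is off by a fixed margin on every class, which is exactly the mode-coverage failure the abstract attributes to ClusterGAN.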


