Discovering Discrete Latent Topics with Neural Variational Inference

ICML 2017 · Yishu Miao, Edward Grefenstette, Phil Blunsom

Topic models have been widely explored as probabilistic generative models of documents. Traditional inference methods have sought closed-form derivations for updating the models; however, as the expressiveness of these models grows, so does the difficulty of performing fast and accurate inference over their parameters. This paper presents alternative neural approaches to topic modelling that provide parameterisable distributions over topics, permitting training by backpropagation in the framework of neural variational inference. In addition, with the help of a stick-breaking construction, we propose a recurrent network that is able to discover a notionally unbounded number of topics, analogous to Bayesian non-parametric topic models. Experimental results on the MXM Song Lyrics, 20NewsGroups and Reuters News datasets demonstrate the effectiveness and efficiency of these neural topic models.
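The GSM model reported in the results below is the paper's Gaussian Softmax construction: a bag-of-words document is encoded into a diagonal Gaussian, a reparameterised sample is passed through a softmax to give topic proportions, and a topic-word matrix reconstructs the document. The sketch below is a minimal, illustrative PyTorch version of that idea, not the authors' code; the hyperparameters (hidden size, topic count) and the class name GSMTopicModel are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GSMTopicModel(nn.Module):
    """Minimal Gaussian Softmax (GSM) topic model sketch.

    A bag-of-words vector x is encoded into a diagonal Gaussian q(z|x);
    a sample z is mapped through a softmax to topic proportions theta,
    and a mixture of K topic-word distributions reconstructs x.
    Sizes are illustrative, not the paper's exact settings.
    """
    def __init__(self, vocab_size, num_topics=50, hidden=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, num_topics)
        self.logvar = nn.Linear(hidden, num_topics)
        # Topic-word logits beta (K x V), learned directly.
        self.beta = nn.Parameter(torch.randn(num_topics, vocab_size) * 0.01)

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterisation trick: z = mu + sigma * eps, so gradients
        # flow through the sampling step during backpropagation.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        theta = F.softmax(z, dim=-1)                      # topic proportions
        word_probs = theta @ F.softmax(self.beta, dim=-1) # (batch, vocab)
        recon = -(x * torch.log(word_probs + 1e-10)).sum(-1)
        # Closed-form KL(q(z|x) || N(0, I)) for diagonal Gaussians.
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        return (recon + kl).mean()  # negative ELBO, to be minimised
```

Minimising this objective with any stochastic gradient optimiser trains the encoder and the topic-word matrix jointly, which is the "training by backpropagation" the abstract refers to.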

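For the notionally unbounded number of topics, the abstract describes a recurrent network built on a stick-breaking construction. The sketch below illustrates the general mechanism under our own assumptions: a GRU conditioned on the latent sample emits breaking fractions in (0, 1), which are folded into proportions on the simplex. Module names and sizes are hypothetical, and the paper's actual recurrent stick-breaking parameterisation may differ in detail.

```python
import torch
import torch.nn as nn

def stick_breaking(fractions):
    """Map breaking fractions in (0,1) to simplex proportions:
    theta_k = f_k * prod_{j<k} (1 - f_j), with leftover stick mass
    assigned to a final catch-all topic so the output sums to one."""
    remaining = torch.cumprod(1.0 - fractions, dim=-1)
    remaining = torch.cat(
        [torch.ones_like(fractions[..., :1]), remaining[..., :-1]], dim=-1)
    theta = fractions * remaining
    leftover = 1.0 - theta.sum(-1, keepdim=True)
    return torch.cat([theta, leftover.clamp_min(0.0)], dim=-1)

class RecurrentStickBreaking(nn.Module):
    """Hypothetical recurrent stick-breaking sketch: a GRU emits one
    breaking fraction per step from the latent Gaussian sample z, so
    the truncation level can be raised by simply unrolling further."""
    def __init__(self, latent_dim, max_topics=50, hidden=128):
        super().__init__()
        self.rnn = nn.GRU(latent_dim, hidden, batch_first=True)
        self.to_fraction = nn.Linear(hidden, 1)
        self.max_topics = max_topics

    def forward(self, z):
        # Feed the same z at every step; K - 1 fractions yield K topics.
        steps = z.unsqueeze(1).expand(-1, self.max_topics - 1, -1)
        h, _ = self.rnn(steps)
        fractions = torch.sigmoid(self.to_fraction(h)).squeeze(-1)
        return stick_breaking(fractions)  # (batch, max_topics)
```

In a model like the GSM sketch above, this module would replace the softmax over z when producing theta, trading the fixed-dimensional softmax for proportions whose effective number of active topics is learned from data.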

Results from the Paper


Task           Dataset        Model   Metric   Value   Global Rank
Topic Models   20NewsGroups   GSM     C_v      0.55    #2
Topic Models   AG News        GSM     C_v      0.41    #3
Topic Models   AG News        GSM     NPMI     0.03    #2

Methods


No methods listed for this paper.