This research proposes a new (old) metric for evaluating goodness of fit in topic models, the coefficient of determination, or $R^2$.
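As a concrete illustration, the coefficient of determination can be computed for a topic model by comparing observed document-term counts against the counts the model predicts. The sketch below is my own minimal interpretation, not the paper's exact formulation: the names `theta` and `phi`, and the choice to define residuals over raw counts, are assumptions.

```python
import numpy as np

def topic_model_r2(counts, theta, phi):
    """Goodness-of-fit R^2 for a topic model (illustrative sketch).

    counts: (D, V) observed document-term counts
    theta:  (D, K) per-document topic proportions
    phi:    (K, V) per-topic word distributions
    Predicted counts are n_d * (theta @ phi); R^2 = 1 - SS_res / SS_tot.
    """
    counts = np.asarray(counts, dtype=float)
    doc_lengths = counts.sum(axis=1, keepdims=True)
    predicted = doc_lengths * (theta @ phi)
    ss_res = ((counts - predicted) ** 2).sum()          # residual sum of squares
    ss_tot = ((counts - counts.mean()) ** 2).sum()      # total sum of squares
    return 1.0 - ss_res / ss_tot
```

A model whose predicted word distributions reproduce the observed counts exactly attains $R^2 = 1$; worse fits yield smaller values.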
Supervised topic models are often designed to balance prediction quality and interpretability.
We present the Bayesian subspace multinomial model (Bayesian SMM), a generative log-linear model that learns to represent documents as Gaussian distributions, thereby encoding uncertainty in the covariance.
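To make the idea concrete: in a subspace multinomial model, a document's word distribution is a softmax of a low-dimensional embedding projected into vocabulary space, and the Bayesian variant replaces the point embedding with a Gaussian. The sketch below is a toy rendering under those assumptions; the function names and the Monte Carlo averaging are mine, not the paper's inference procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def doc_word_distribution(t, T, w):
    """Log-linear (subspace multinomial) word distribution for one document.

    t: (V,) bias over the vocabulary
    T: (V, K) subspace bases
    w: (K,) document embedding
    Returns softmax(t + T @ w), a distribution over V words.
    """
    logits = t + T @ w
    logits -= logits.max()            # numerical stability
    p = np.exp(logits)
    return p / p.sum()

def expected_word_distribution(t, T, mu, sigma2, n_samples=100):
    """Bayesian variant: each document is a Gaussian q(w) = N(mu, diag(sigma2))
    rather than a point w; here we average word distributions over samples."""
    samples = mu + np.sqrt(sigma2) * rng.standard_normal((n_samples, mu.size))
    return np.mean([doc_word_distribution(t, T, w) for w in samples], axis=0)
```

A wider `sigma2` spreads probability mass more evenly, which is how the covariance carries the model's uncertainty about a document.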
SOTA for Topic Models on 20NEWS
In this paper, we develop an online inference algorithm for topic models which leverages stochasticity to scale well in the number of documents, sparsity to scale well in the number of topics, and which operates in the collapsed representation of the topic model for improved accuracy and run-time performance.
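A single step of such a collapsed, stochastic update can be sketched as follows. This is a simplified sketch in the spirit of stochastic collapsed variational inference; the variable names, and the omission of minibatch scale factors, are my assumptions rather than the paper's exact algorithm.

```python
import numpy as np

def scvb0_token_update(n_kw, n_k, n_dk, w, d, alpha, beta, V, rho):
    """One stochastic collapsed variational update for token w in document d.

    n_kw: (K, V) expected topic-word counts
    n_k:  (K,)   expected topic totals
    n_dk: (D, K) expected document-topic counts
    rho:  step size
    """
    # Collapsed responsibility of each topic for this token.
    gamma = (n_kw[:, w] + beta) / (n_k + V * beta) * (n_dk[d] + alpha)
    gamma /= gamma.sum()
    # Stochastic in-place update of the expected count statistics.
    n_kw[:, w] = (1 - rho) * n_kw[:, w] + rho * gamma
    n_k[:] = (1 - rho) * n_k + rho * gamma
    n_dk[d] = (1 - rho) * n_dk[d] + rho * gamma
    return gamma
```

Sparsity enters because `gamma` is typically concentrated on few topics, so truncated updates touch only a few rows of the count statistics.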
Massive Open Online Courses (MOOCs) are educational programs that are open and accessible to a large number of people through the internet.
We introduce a dynamic generative model, Bayesian allocation model (BAM), which establishes explicit connections between nonnegative tensor factorization (NTF), graphical models of discrete probability distributions and their Bayesian extensions, and the topic models such as the latent Dirichlet allocation.
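One of the connections the abstract mentions can be shown in a few lines: under a topic model such as LDA, the expected document-term distribution is exactly a nonnegative factorization of document-topic weights and topic-word distributions. The sketch below illustrates that structural point only; the sizes and Dirichlet draws are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

D, K, V = 4, 2, 5
theta = rng.dirichlet(np.ones(K), size=D)   # document-topic weights (D, K)
phi = rng.dirichlet(np.ones(V), size=K)     # topic-word distributions (K, V)

# The expected document-term distribution is the nonnegative product theta @ phi,
# i.e. the same low-rank structure that nonnegative tensor/matrix factorization posits.
p_dw = theta @ phi
```

Because each row of `theta` and each row of `phi` sums to one, every row of `p_dw` is itself a probability distribution over the vocabulary.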
Previous research showed that machine learning can predict ratings of literariness from texts to a substantial extent, suggesting that the consensus among readers on which novels are literary may be explained as a consensus on the kind of writing style that characterizes literature.
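Predicting a continuous literariness rating from text features is, at its core, a regression problem. The sketch below shows the bare mechanics with closed-form ridge regression on two hypothetical style features; the features and ratings are illustrative stand-ins, not real data, and the cited research used richer text representations.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X^T X + lam I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Hypothetical style features per novel: mean sentence length and type-token ratio.
X = np.array([[12.0, 0.40], [25.0, 0.70], [18.0, 0.55], [30.0, 0.80]])
# Hypothetical literariness ratings for those novels.
y = np.array([3.1, 5.8, 4.4, 6.5])

w = ridge_fit(X, y, lam=0.1)
pred = X @ w
```

In practice one would cross-validate the regularizer `lam` and hold out novels to estimate how much of the rating variance the style features actually explain.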