Topic Models
210 papers with code • 6 benchmarks • 12 datasets
A topic model is a type of statistical model for discovering the abstract "topics" that occur in a collection of documents. Topic modeling is a frequently used text-mining tool for the discovery of hidden semantic structures in a text body.
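The classic instance of such a model is latent Dirichlet allocation (LDA). As a minimal, self-contained sketch of the idea, the following toy collapsed Gibbs sampler fits LDA on a tiny corpus; the corpus, hyperparameters, and function name are illustrative assumptions, not taken from any paper listed below:

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics=2, alpha=0.1, beta=0.01, n_iter=200, seed=0):
    """Toy collapsed Gibbs sampling for LDA on pre-tokenized documents.

    Returns the top words per topic (illustrative helper, not a library API).
    """
    rng = random.Random(seed)
    vocab = {w for d in docs for w in d}
    V = len(vocab)
    # Random initial topic assignment for every token.
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]
    ndk = [[0] * n_topics for _ in docs]                # doc -> topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]   # topic -> word counts
    nk = [0] * n_topics                                 # topic totals
    for di, d in enumerate(docs):
        for wi, w in enumerate(d):
            k = z[di][wi]
            ndk[di][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(n_iter):
        for di, d in enumerate(docs):
            for wi, w in enumerate(d):
                # Remove the token's current assignment from the counts...
                k = z[di][wi]
                ndk[di][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # ...then resample its topic from the full conditional
                # p(z=k | rest) ∝ (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta).
                weights = [(ndk[di][t] + alpha) * (nkw[t][w] + beta) / (nk[t] + V * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[di][wi] = k
                ndk[di][k] += 1; nkw[k][w] += 1; nk[k] += 1
    # Summarize each discovered topic by its three most frequent words.
    return [sorted(nkw[t], key=nkw[t].get, reverse=True)[:3] for t in range(n_topics)]

docs = [["apple", "banana", "fruit"], ["fruit", "apple", "juice"],
        ["goal", "match", "team"], ["team", "goal", "score"]]
topics = lda_gibbs(docs)
```

With a clearly separable toy corpus like this, the sampler typically groups the fruit words into one topic and the sports words into the other; production systems would instead use an optimized library implementation.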
Libraries
Use these libraries to find topic-model implementations.
Latest papers with no code
Graph2topic: an open-source topic modeling framework based on sentence embedding and community detection
However, these approaches suffer from an inability to select appropriate parameters and from incomplete models that overlook the quantitative relations between words and topics and between topics and texts.
A User-Centered, Interactive, Human-in-the-Loop Topic Modelling System
Human-in-the-loop topic modelling incorporates users' knowledge into the modelling process, enabling them to refine the model iteratively.
Topics in the Haystack: Extracting and Evaluating Topics beyond Coherence
We demonstrate the competitive performance of our method with a large benchmark study, and achieve superior results compared to state-of-the-art topic modeling and document clustering models.
Federated Variational Inference Methods for Structured Latent Variable Models
Federated learning methods enable model training across distributed data sources without data leaving their original locations and have gained increasing interest in various fields.
You Are What You Talk About: Inducing Evaluative Topics for Personality Analysis
Expressing attitude or stance toward entities and concepts is an integral part of human behavior and personality.
Improving the Inference of Topic Models via Infinite Latent State Replications
In text mining, topic models are a type of probabilistic generative model for inferring latent semantic topics from a text corpus.
Interpretable and Scalable Graphical Models for Complex Spatio-temporal Processes
Fourth, it proposes a modular and interpretable framework for unsupervised and weakly-supervised probabilistic topic modeling of time-varying data that combines generative statistical models with computational geometric methods.
Topics in Contextualised Attention Embeddings
Contextualised word vectors obtained via pre-trained language models encode a variety of knowledge that has already been exploited in applications.
Topics as Entity Clusters: Entity-based Topics from Language Models and Graph Neural Networks
We demonstrate that our approach consistently outperforms other state-of-the-art topic models across coherency metrics and find that the explicit knowledge encoded in the graph-based embeddings provides more coherent topics than the implicit knowledge encoded with the contextualized embeddings of language models.
Using Open-Ended Stressor Responses to Predict Depressive Symptoms across Demographics
Stressors are related to depression, but this relationship is complex.