Word Sense Induction

19 papers with code • 1 benchmark • 1 dataset

Word sense induction (WSI) is widely known as the “unsupervised version” of WSD. The problem is stated as follows: given a target word (e.g., “cold”) and a collection of sentences that use it (e.g., “I caught a cold”, “The weather is cold”), cluster the sentences according to the different senses/meanings of the target word. We do not need to label the sense/meaning of each cluster, but the sentences within a cluster should use the target word with the same sense.

Description from NLP Progress
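
As a concrete sketch of the setup described above (not tied to any particular benchmark), the toy example below represents each usage of “cold” by its surrounding words with TF-IDF and clusters the usages with k-means; the example sentences, the feature choice, and k = 2 are assumptions made purely for illustration.

# A minimal WSI sketch: cluster usages of the ambiguous target word "cold"
# by the words that surround it. Features and k=2 are illustrative choices.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

sentences = [
    "I caught a cold and stayed in bed",
    "She has a nasty cold and a sore throat",
    "The weather is cold tonight",
    "A cold wind blew across the bay",
]

# Represent each usage by its context (here, the sentence minus the target word).
contexts = [s.lower().replace("cold", "") for s in sentences]
X = TfidfVectorizer().fit_transform(contexts)

# Cluster the usages; each cluster corresponds to one induced sense of "cold".
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for sentence, label in zip(sentences, labels):
    print(label, sentence)

With real data, the context representation is usually richer (e.g., contextual embeddings) and the number of senses is not fixed in advance, but the clustering formulation stays the same.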

Latest papers with no code

Combining Lexical Substitutes in Neural Word Sense Induction

no code yet • RANLP 2019

Word Sense Induction (WSI) is the task of grouping occurrences of an ambiguous word according to their meaning.

Using Wiktionary as a resource for WSD : the case of French verbs

no code yet • WS 2019

In this paper, we investigate which strategy to adopt to achieve WSD for languages lacking data that was annotated specifically for the task, focusing on the particular case of verb disambiguation in French.

Vector representations of text data in deep learning

no code yet • 7 Jan 2019

For document-level representations we propose Binary Paragraph Vector: a neural network model for learning binary representations of text documents, which can be used for fast document retrieval.
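
As a rough, generic illustration of why binary document codes make retrieval fast (this is not the Binary Paragraph Vector model itself, and the random embeddings are placeholders), the sketch below binarizes dense vectors by sign and ranks documents by Hamming distance computed with XOR and popcount.

# Illustrative only: generic binary-code retrieval, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)
dense_docs = rng.normal(size=(10_000, 128))   # placeholder document embeddings
dense_query = rng.normal(size=128)            # placeholder query embedding

# Binarize: one bit per dimension, packed into bytes.
doc_codes = np.packbits(dense_docs > 0, axis=1)   # shape (10000, 16)
query_code = np.packbits(dense_query > 0)         # shape (16,)

# Hamming distance via XOR + popcount, then take the closest documents.
hamming = np.unpackbits(doc_codes ^ query_code, axis=1).sum(axis=1)
top10 = np.argsort(hamming)[:10]
print(top10)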

Word Sense Induction using Knowledge Embeddings

no code yet • 23 Oct 2018

By grounding them to knowledge bases, they are able to learn multi-word representations and are also interpretable.

Disambiguated skip-gram model

no code yet • EMNLP 2018

This allows us to control the granularity of representations learned by our model.

How much does a word weigh? Weighting word embeddings for word sense induction

no code yet • 23 May 2018

The paper describes our participation in the first shared task on word sense induction and disambiguation for the Russian language, RUSSE'2018 (Panchenko et al., 2018).

Leveraging Lexical Substitutes for Unsupervised Word Sense Induction

no code yet • AAAI 2018

Word sense induction is the most prominent unsupervised approach to lexical disambiguation.

Efficient Graph-based Word Sense Induction by Distributional Inclusion Vector Embeddings

no code yet • WS 2018

Word sense induction (WSI), which addresses polysemy by unsupervised discovery of multiple word senses, resolves ambiguities for downstream NLP tasks and also makes word representations more interpretable.