
Word Sense Disambiguation

39 papers with code · Natural Language Processing

The task of Word Sense Disambiguation (WSD) consists of associating words in context with their most suitable entry in a pre-defined sense inventory. The de facto sense inventory for English WSD is WordNet. For example, given the word “mouse” and the following sentence:

“A mouse is an object held in one's hand, with one or more buttons.”

we would assign “mouse” its electronic-device sense (the fourth sense in the WordNet inventory).
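
The sense inventory and a classic gloss-overlap baseline can be explored directly in code. Below is a minimal sketch using NLTK's WordNet interface and its built-in simplified Lesk baseline; it assumes NLTK is installed and the WordNet corpus has been downloaded via nltk.download("wordnet"), and it illustrates the task rather than a competitive WSD system.

```python
# Minimal sketch: enumerate WordNet senses of "mouse" and run NLTK's
# simplified-Lesk baseline, which picks the sense whose dictionary gloss
# overlaps most with the surrounding context words.
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk

# The WordNet sense inventory for the noun "mouse".
for i, synset in enumerate(wn.synsets("mouse", pos=wn.NOUN), start=1):
    print(i, synset.name(), "-", synset.definition())

# Disambiguate "mouse" in the example sentence from above.
context = "A mouse is an object held in one's hand, with one or more buttons".split()
print(lesk(context, "mouse", pos="n"))
```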

Latest papers with code

FlauBERT: Unsupervised Language Model Pre-training for French

11 Dec 2019 · getalp/Flaubert

Language models have become a key step in achieving state-of-the-art results in many Natural Language Processing (NLP) tasks.

LANGUAGE MODELLING · NATURAL LANGUAGE INFERENCE · TEXT CLASSIFICATION · WORD SENSE DISAMBIGUATION


Word-Class Embeddings for Multiclass Text Classification

26 Nov 2019 · AlexMoreo/word-class-embeddings

Pre-trained word embeddings encode general word semantics and lexical regularities of natural language, and have proven useful across many NLP tasks, including word sense disambiguation, machine translation, and sentiment analysis.

MACHINE TRANSLATION · SENTIMENT ANALYSIS · TEXT CLASSIFICATION · WORD EMBEDDINGS · WORD SENSE DISAMBIGUATION


Improved Word Sense Disambiguation Using Pre-Trained Contextualized Word Representations

IJCNLP 2019 · nusnlp/contextemb-wsd

Contextualized word representations can assign different vectors to the same word in different contexts, and they have been shown to be effective in downstream natural language processing tasks such as question answering, named entity recognition, and sentiment analysis.

NAMED ENTITY RECOGNITION · QUESTION ANSWERING · SENTIMENT ANALYSIS · WORD EMBEDDINGS · WORD SENSE DISAMBIGUATION


Does BERT Make Any Sense? Interpretable Word Sense Disambiguation with Contextualized Embeddings

23 Sep 2019 · uhh-lt/bert-sense

Since vectors of the same word type can vary depending on the respective context, they implicitly provide a model for word sense disambiguation (WSD); a minimal nearest-neighbour sketch of this idea follows this entry.

WORD SENSE DISAMBIGUATION

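One way to make that observation concrete is nearest-neighbour WSD over contextualized vectors: embed sense-annotated occurrences of a word, then give a new occurrence the sense of its most similar stored vector. The sketch below is not the authors' exact pipeline; the model choice, the two-example "training set", and the sense labels are all illustrative.

```python
# Hedged sketch of nearest-neighbour WSD with contextualized embeddings.
# Assumes `pip install torch transformers`; bert-base-uncased is illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentence: str, word: str) -> torch.Tensor:
    """Mean-pool the final hidden states of the subword tokens of `word`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]
    word_ids = tokenizer(word, add_special_tokens=False)["input_ids"]
    tokens = enc["input_ids"][0].tolist()
    for i in range(len(tokens) - len(word_ids) + 1):  # locate the word's subwords
        if tokens[i : i + len(word_ids)] == word_ids:
            return hidden[i : i + len(word_ids)].mean(dim=0)
    raise ValueError(f"{word!r} not found in {sentence!r}")

# Hypothetical sense-annotated examples standing in for a training corpus.
examples = [
    ("She clicked the mouse to open the file.", "mouse%device"),
    ("The cat chased a mouse across the field.", "mouse%animal"),
]
labelled = [(embed(sent, "mouse"), sense) for sent, sense in examples]

# A new occurrence is assigned the sense of its nearest stored vector.
query = embed("Scroll down using the mouse wheel.", "mouse")
best = max(labelled, key=lambda v: torch.cosine_similarity(v[0], query, dim=0))
print(best[1])  # expected: mouse%device
```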

Zero-shot Word Sense Disambiguation using Sense Definition Embeddings

ACL 2019 · malllabiisc/EWISE

Supervised WSD systems typically treat senses as discrete labels and fall back on the most frequent sense for words unseen during training. To overcome this challenge, we propose Extended WSD Incorporating Sense Embeddings (EWISE), a supervised model that performs WSD by predicting over a continuous sense embedding space rather than a discrete label space; a minimal sketch of this idea follows this entry.

KNOWLEDGE GRAPH EMBEDDING · WORD SENSE DISAMBIGUATION · ZERO-SHOT LEARNING

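The core idea fits in a few lines: instead of a softmax over a fixed label set, the model predicts a point in a sense embedding space and scores it against definition embeddings of the candidate senses, so senses never seen in training can still be ranked. The sketch below is a simplified stand-in for EWISE, not the authors' architecture; all dimensions and tensors are illustrative.

```python
# Simplified sketch of WSD over a continuous sense embedding space.
import torch
import torch.nn as nn

CTX_DIM, SENSE_DIM = 768, 300  # illustrative sizes

class SenseSpaceWSD(nn.Module):
    def __init__(self):
        super().__init__()
        # Project an encoded context into the sense embedding space.
        self.project = nn.Linear(CTX_DIM, SENSE_DIM)

    def forward(self, context_vec: torch.Tensor, candidate_defs: torch.Tensor):
        predicted = self.project(context_vec)  # point in sense space
        return candidate_defs @ predicted      # one score per candidate sense

model = SenseSpaceWSD()
context = torch.randn(CTX_DIM)           # stand-in for an encoded context
definitions = torch.randn(5, SENSE_DIM)  # stand-in sense-definition embeddings
scores = model(context, definitions)
print(scores.argmax().item())            # index of the best-scoring sense
```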

LIAAD at SemDeep-5 Challenge: Word-in-Context (WiC)

WS 2019 · danlou/LMMS

This paper describes the LIAAD system, which placed second in the Word-in-Context (WiC) challenge featured in SemDeep-5.

WORD SENSE DISAMBIGUATION


Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation

ACL 2019 · danlou/LMMS

Contextual embeddings are a new generation of semantic representations, learned through Neural Language Modelling (NLM), that address the meaning-conflation problem hampering traditional word embeddings; a simplified sketch of the full-coverage propagation idea follows this entry.

LANGUAGE MODELLING · WORD EMBEDDINGS · WORD SENSE DISAMBIGUATION

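The "full-coverage" part of the title refers to giving every WordNet sense a vector even when it never appears in the annotated training data. Below is a loose paraphrase of that idea, not the paper's exact propagation scheme: senses lacking a learned vector fall back to an average over their hypernyms' vectors. The `known` entries are random stand-ins for learned embeddings.

```python
# Simplified sketch: propagate embeddings through WordNet so that senses
# unseen in training still receive a vector (here via hypernym fallback).
import numpy as np
from nltk.corpus import wordnet as wn

dim = 300
known = {  # random stand-ins for embeddings learned from annotated data
    "mouse.n.01": np.random.rand(dim),  # rodent sense
    "mouse.n.04": np.random.rand(dim),  # computer-device sense
}

def embedding_for(synset):
    """Return the learned vector, else average the hypernyms' vectors."""
    if synset.name() in known:
        return known[synset.name()]
    parents = [embedding_for(h) for h in synset.hypernyms()]
    parents = [p for p in parents if p is not None]
    return np.mean(parents, axis=0) if parents else None

# "house_mouse.n.01" has no learned vector, but inherits one from its
# hypernym "mouse.n.01", so it is still covered.
print(embedding_for(wn.synset("house_mouse.n.01")).shape)
```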

Making Fast Graph-based Algorithms with Graph Metric Embeddings

ACL 2019 · uhh-lt/path2vec

Computing distance measures between nodes in a graph is inefficient and does not scale to large graphs; the sketch below shows how embedding such a metric turns distance queries into fast vector operations.

SEMANTIC SIMILARITY · SEMANTIC TEXTUAL SIMILARITY · WORD SENSE DISAMBIGUATION

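The remedy is to embed the graph metric itself: learn one vector per node such that dot products reproduce the pairwise similarity values, so each query becomes O(d) vector math instead of a graph traversal. The toy sketch below illustrates the idea with an exact eigendecomposition of a small similarity matrix; path2vec itself learns the embeddings with gradient-based training on WordNet similarities.

```python
# Toy sketch of a graph metric embedding: factor a similarity matrix so
# that node-vector dot products reproduce it.
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.random((n, 3))
sim = A @ A.T  # toy positive semi-definite node-similarity matrix

# S = (U sqrt(L)) (U sqrt(L))^T, so rows of U * sqrt(L) are node embeddings.
vals, vecs = np.linalg.eigh(sim)
emb = vecs * np.sqrt(np.clip(vals, 0.0, None))

# A similarity query is now just a dot product (matches up to float error).
print(float(emb[0] @ emb[1]), float(sim[0, 1]))
```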