About

The task of Word Sense Disambiguation (WSD) consists of associating words in context with their most suitable entry in a pre-defined sense inventory. The de facto sense inventory for English in WSD is WordNet. For example, given the word “mouse” and the following sentence:

“A mouse consists of an object held in one's hand, with one or more buttons.”

we would assign “mouse” its electronic-device sense (the 4th sense in the WordNet sense inventory).
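
As a concrete illustration, here is a minimal sketch of knowledge-based WSD over the WordNet inventory, using NLTK's implementation of the classic Lesk algorithm. This is a simple baseline shown for illustration only, not the method of any paper listed below, and Lesk is not guaranteed to pick the intended sense:

```python
import nltk
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk

nltk.download("wordnet", quiet=True)  # one-time download of the sense inventory

sentence = "A mouse consists of an object held in one's hand, with one or more buttons."
tokens = sentence.lower().replace(",", " ").replace(".", " ").split()

# List every candidate noun sense of "mouse" in WordNet.
for synset in wn.synsets("mouse", pos=wn.NOUN):
    print(synset.name(), "-", synset.definition())

# Lesk picks the sense whose dictionary gloss overlaps most with the context;
# for this sentence the intended answer is mouse.n.04, the electronic device.
predicted = lesk(tokens, "mouse", pos="n")
print("Predicted sense:", predicted.name())
```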

Benchmarks

Subtasks

Datasets

Greatest papers with code

FlauBERT: Unsupervised Language Model Pre-training for French

LREC 2020 huggingface/transformers

Language models have become a key step to achieve state-of-the art results in many different Natural Language Processing (NLP) tasks.

LANGUAGE MODELLING NATURAL LANGUAGE INFERENCE TEXT CLASSIFICATION WORD SENSE DISAMBIGUATION

Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures

EMNLP 2018 awslabs/sockeye

Recently, non-recurrent architectures (convolutional, self-attentional) have outperformed RNNs in neural machine translation.

MACHINE TRANSLATION WORD SENSE DISAMBIGUATION

Distributed Word Representation in Tsetlin Machine

14 Apr 2021 cair/TsetlinMachine

This restriction has constrained the performance of the Tsetlin Machine (TM) compared to deep neural networks (DNNs) in NLP.

SENTIMENT ANALYSIS TEXT CLASSIFICATION WORD SENSE DISAMBIGUATION

Knowledge Enhanced Contextual Word Representations

IJCNLP 2019 allenai/kb

Contextual word representations, typically trained on unstructured, unlabeled text, do not contain any explicit grounding to real-world entities and are often unable to remember facts about those entities.

Ranked #6 on Relation Extraction on TACRED (using extra training data)

ENTITY LINKING ENTITY TYPING LANGUAGE MODELLING RELATION EXTRACTION WORD SENSE DISAMBIGUATION

Zero-shot Word Sense Disambiguation using Sense Definition Embeddings

ACL 2019 malllabiisc/EWISE

To overcome this challenge, we propose Extended WSD Incorporating Sense Embeddings (EWISE), a supervised model to perform WSD by predicting over a continuous sense embedding space as opposed to a discrete label space.

GENERALIZED ZERO-SHOT LEARNING KNOWLEDGE GRAPH EMBEDDING WORD SENSE DISAMBIGUATION
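
EWISE's core idea of predicting in a continuous sense-embedding space can be sketched in a few lines. Everything below is illustrative: the vectors are random stand-ins, whereas EWISE itself learns a BiLSTM context encoder and trains definition embeddings with a knowledge-graph-embedding objective (see the paper for the actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins: in EWISE these come from a trained context
# encoder and from embeddings of WordNet sense definitions.
context_vec = rng.standard_normal(300)       # encoding of the target word in context
sense_embeddings = {
    "mouse.n.01": rng.standard_normal(300),  # rodent sense
    "mouse.n.04": rng.standard_normal(300),  # electronic-device sense
}

# Score each candidate sense by similarity in the continuous embedding
# space instead of classifying over a fixed, discrete label set. Unseen
# senses can be scored too, as long as their definitions can be embedded,
# which is what enables zero-shot WSD.
scores = {sense: context_vec @ vec for sense, vec in sense_embeddings.items()}
print(max(scores, key=scores.get))
```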

Incorporating Glosses into Neural Word Sense Disambiguation

ACL 2018 jimiyulu/WSD_MemNN

GAS, a gloss-augmented WSD neural network, models the semantic relationship between the context and the gloss in an improved memory-network framework, bridging the gap between previous supervised methods and knowledge-based methods.

WORD SENSE DISAMBIGUATION
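
To make the memory-network idea concrete, here is a toy sketch of one context-to-gloss attention pass. The dimensions and random vectors are placeholders; GAS itself encodes the context and each gloss with BiLSTMs and runs several memory passes:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
d = 64
context = rng.standard_normal(d)       # encoded context of the target word (placeholder)
glosses = rng.standard_normal((4, d))  # encoded glosses of 4 candidate senses (placeholder)

# One memory pass: attend from the context over the gloss memory, then
# fold the attended gloss summary back into the context representation.
attention = softmax(glosses @ context)
context = context + attention @ glosses

# Score each candidate sense by compatibility with the updated context.
scores = glosses @ context
print("predicted sense index:", int(scores.argmax()))
```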