Word Sense Disambiguation

92 papers with code • 13 benchmarks • 16 datasets

The task of Word Sense Disambiguation (WSD) consists of associating words in context with their most suitable entry in a pre-defined sense inventory. The de facto sense inventory for English in WSD is WordNet. For example, given the word “mouse” and the following sentence:

“A mouse consists of an object held in one's hand, with one or more buttons.”

we would assign “mouse” its electronic-device sense (the 4th sense in the WordNet sense inventory).
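As a minimal illustration of this setup, the sketch below queries WordNet's sense inventory for “mouse” and applies NLTK's simplified Lesk baseline to pick a sense from the context sentence. The library calls (nltk.corpus.wordnet, nltk.wsd.lesk) are NLTK's; the example is only a toy gloss-overlap baseline, not representative of the neural systems listed below.

```python
# Minimal sketch: enumerate WordNet senses of "mouse" and pick one with
# NLTK's simplified Lesk baseline (gloss/context word overlap).
# Illustrative toy only, not a state-of-the-art WSD system.
import nltk
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk

nltk.download("wordnet", quiet=True)  # the de facto English sense inventory

context = "A mouse consists of an object held in one's hand, with one or more buttons."
tokens = context.lower().split()

# The pre-defined sense inventory: all noun senses of "mouse" in WordNet.
for i, synset in enumerate(wn.synsets("mouse", pos=wn.NOUN), start=1):
    print(f"{i}. {synset.name()}: {synset.definition()}")

# Simplified Lesk: choose the sense whose gloss shares the most words
# with the context sentence.
predicted = lesk(tokens, "mouse", pos="n")
print("Predicted sense:", predicted.name(), "-", predicted.definition())
```

Supervised neural models such as those below replace the overlap heuristic with contextual representations, but the task framing stays the same: map a word in context to an entry in the WordNet inventory.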

Greatest papers with code

DeBERTa: Decoding-enhanced BERT with Disentangled Attention

huggingface/transformers ICLR 2021

Recent progress in pre-trained neural language models has significantly improved the performance of many natural language processing (NLP) tasks.

Common Sense Reasoning • Coreference Resolution • +11

FlauBERT: Unsupervised Language Model Pre-training for French

huggingface/transformers LREC 2020

Language models have become a key step in achieving state-of-the-art results in many different Natural Language Processing (NLP) tasks.

Language Modelling • Language Understanding • +3

Language Models are Few-Shot Learners

openai/gpt-3 NeurIPS 2020

By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.

Common Sense Reasoning • Coreference Resolution • +10

Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures

awslabs/sockeye EMNLP 2018

Recently, non-recurrent architectures (convolutional, self-attentional) have outperformed RNNs in neural machine translation.

Machine Translation • Translation • +1

Knowledge Enhanced Contextual Word Representations

allenai/kb IJCNLP 2019

Contextual word representations, typically trained on unstructured, unlabeled text, do not contain any explicit grounding to real world entities and are often unable to remember facts about those entities.

Entity Linking • Entity Typing • +3

Improved Word Representation Learning with Sememes

thunlp/SE-WRL ACL 2017

The key idea is to utilize word sememes to accurately capture the exact meanings of a word within specific contexts.

Common Sense Reasoning • Language Modelling • +5

NSP-BERT: A Prompt-based Zero-Shot Learner Through an Original Pre-training Task--Next Sentence Prediction

sunyilgdx/NSP-BERT 8 Sep 2021

Using prompts to make language models perform various downstream tasks, also known as prompt-based learning or prompt-learning, has lately achieved significant success in comparison to the pre-train and fine-tune paradigm.

Entity Linking • Language Modelling • +1

Dict2vec : Learning Word Embeddings using Lexical Dictionaries

tca19/dict2vec EMNLP 2017

Learning word embeddings on large unlabeled corpora has been shown to be successful in improving many natural language tasks.

General Classification • Knowledge Graphs • +7