
Word Sense Disambiguation

38 papers with code · Natural Language Processing

The task of Word Sense Disambiguation (WSD) consists of associating words in context with their most suitable entry in a pre-defined sense inventory. The de facto sense inventory for English in WSD is WordNet. For example, given the word “mouse” and the following sentence:

“A mouse consists of an object held in one's hand, with one or more buttons.”

we would assign “mouse” its electronic device sense (the 4th sense in the WordNet sense inventory).
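
As a minimal illustration of this setup (a sketch assuming NLTK and its WordNet corpus are available; the Lesk call is just a classic knowledge-based baseline, not a state-of-the-art WSD system), the senses of “mouse” can be listed and a baseline prediction made as follows:

```python
# A minimal sketch using NLTK's WordNet interface; Lesk is only a simple
# knowledge-based baseline and will not always pick the intended sense.
import nltk
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk

nltk.download("wordnet", quiet=True)

sentence = "A mouse consists of an object held in one's hand, with one or more buttons."

# The sense inventory: every WordNet synset for the lemma "mouse".
for synset in wn.synsets("mouse"):
    print(synset.name(), "-", synset.definition())
# mouse.n.04 is the hand-operated electronic device mentioned above.

# Baseline disambiguation: Lesk picks the synset whose gloss overlaps most with the context.
print("Lesk prediction:", lesk(sentence.split(), "mouse", pos="n"))
```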

Latest papers with code

Does BERT Make Any Sense? Interpretable Word Sense Disambiguation with Contextualized Embeddings

23 Sep 2019 · uhh-lt/bert-sense

Since vectors of the same word type can vary depending on the respective context, they implicitly provide a model for word sense disambiguation (WSD).

WORD SENSE DISAMBIGUATION

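To make the idea concrete, here is a hedged sketch of that nearest-neighbour scheme, assuming HuggingFace Transformers and the bert-base-uncased checkpoint; the sense “prototypes” are built from single made-up example sentences rather than the SemCor annotations the paper actually uses:

```python
# Hedged sketch of the nearest-neighbour idea behind uhh-lt/bert-sense: contextual
# vectors of the same word differ with context, so a target occurrence is tagged
# with the sense whose precomputed example vector is closest.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Mean of the final hidden states of the word-piece tokens belonging to `word`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]            # (seq_len, 768)
    pieces = tokenizer(word, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    for i in range(len(ids) - len(pieces) + 1):                # first occurrence of the word
        if ids[i:i + len(pieces)] == pieces:
            return hidden[i:i + len(pieces)].mean(dim=0)
    raise ValueError(f"{word!r} not found in sentence")

# Hypothetical sense prototypes, one example sentence each (placeholders for SemCor averages).
prototypes = {
    "mouse.n.01": word_vector("The cat chased a small grey mouse.", "mouse"),
    "mouse.n.04": word_vector("Click the left button of the mouse.", "mouse"),
}

query = word_vector("A mouse is held in one's hand, with one or more buttons.", "mouse")
best = max(prototypes, key=lambda s: torch.cosine_similarity(query, prototypes[s], dim=0).item())
print("nearest sense:", best)   # the device sense is the intended winner here
```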

Zero-shot Word Sense Disambiguation using Sense Definition Embeddings

ACL 2019 malllabiisc/EWISE

To overcome this challenge, we propose Extended WSD Incorporating Sense Embeddings (EWISE), a supervised model to perform WSD by predicting over a continuous sense embedding space as opposed to a discrete label space.

KNOWLEDGE GRAPH EMBEDDING · WORD SENSE DISAMBIGUATION · ZERO-SHOT LEARNING

01 Jul 2019
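
A structural sketch of that continuous-sense-space idea follows; the two encoder functions are placeholders (EWISE trains a BiLSTM context encoder and a definition encoder with knowledge-graph embeddings), so only the scoring-by-dot-product shape is illustrated:

```python
# Structural sketch of the EWISE idea (details are assumptions; see malllabiisc/EWISE):
# rather than a softmax over a fixed, discrete sense label set, a context vector is
# scored against embeddings of sense definitions, so senses never seen in training
# (zero-shot) can still be ranked.
import numpy as np

rng = np.random.default_rng(0)
DIM = 300

def encode_context(sentence: str, target: str) -> np.ndarray:
    # Placeholder for EWISE's BiLSTM context encoder.
    return rng.standard_normal(DIM)

def encode_definition(gloss: str) -> np.ndarray:
    # Placeholder for the definition encoder trained with knowledge-graph embeddings.
    return rng.standard_normal(DIM)

# Candidate senses of "mouse" with abridged WordNet glosses.
glosses = {
    "mouse.n.01": "any of numerous small rodents with pointed snouts",
    "mouse.n.04": "a hand-operated electronic device that controls a cursor",
}

ctx = encode_context("A mouse is held in one's hand, with buttons.", "mouse")
scores = {sense: float(ctx @ encode_definition(g)) for sense, g in glosses.items()}
print(max(scores, key=scores.get))   # highest-scoring sense wins
```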

LIAAD at SemDeep-5 Challenge: Word-in-Context (WiC)

WS 2019 danlou/LMMS

This paper describes the LIAAD system that was ranked second place in the Word-in-Context challenge (WiC) featured in SemDeep-5.

WORD SENSE DISAMBIGUATION

24 Jun 2019

Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation

ACL 2019 danlou/LMMS

Contextual embeddings represent a new generation of semantic representations learned from Neural Language Modelling (NLM) that addresses the issue of meaning conflation hampering traditional word embeddings.

LANGUAGE MODELLING · WORD EMBEDDINGS · WORD SENSE DISAMBIGUATION

24 Jun 2019
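
The full-coverage idea can be sketched roughly as follows, assuming NLTK's WordNet and placeholder sense vectors; the real LMMS propagation also uses glosses and other relations, so only the hypernym fallback is shown:

```python
# Hedged sketch of the full-coverage idea in LMMS (implementation details assumed;
# see danlou/LMMS): sense vectors learned from annotated data exist only for a subset
# of synsets, so missing synsets inherit the mean of their hypernyms' vectors,
# propagated recursively through WordNet until every synset has a representation.
import numpy as np
from nltk.corpus import wordnet as wn

DIM = 1024
rng = np.random.default_rng(0)

# Pretend these are the only senses with vectors learned from SemCor (placeholders).
observed = {
    "mouse.n.01": rng.standard_normal(DIM),
    "device.n.01": rng.standard_normal(DIM),
}

def synset_vector(name: str) -> np.ndarray:
    """Vector for a synset; unseen synsets fall back to the mean of their hypernyms."""
    if name in observed:
        return observed[name]
    hypernyms = wn.synset(name).hypernyms()
    if not hypernyms:                      # reached a root with no learned vector
        return np.zeros(DIM)
    return np.mean([synset_vector(h.name()) for h in hypernyms], axis=0)

# "mouse.n.04" was never annotated, yet it gets a vector via its hypernym chain.
print(synset_vector("mouse.n.04").shape)
```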

Making Fast Graph-based Algorithms with Graph Metric Embeddings

ACL 2019 uhh-lt/path2vec

The computation of distance measures between nodes in graphs is inefficient and does not scale to large graphs.

SEMANTIC TEXTUAL SIMILARITY · WORD SENSE DISAMBIGUATION

17 Jun 2019
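
A small sketch of that motivation, with random stand-ins for the trained vectors (the actual path2vec training objective is omitted):

```python
# Sketch of the path2vec motivation (see uhh-lt/path2vec for the real training):
# graph similarities such as WordNet path similarity require a traversal per query,
# whereas node embeddings trained to approximate the metric reduce each query to a
# dot product.
import numpy as np
from nltk.corpus import wordnet as wn

a, b = wn.synset("mouse.n.01"), wn.synset("rat.n.01")
slow = a.path_similarity(b)                  # graph traversal at query time

# Hypothetical pretrained path2vec-style vectors (random stand-ins here).
rng = np.random.default_rng(0)
emb = {s.name(): rng.standard_normal(300) for s in (a, b)}
fast = float(emb[a.name()] @ emb[b.name()])  # constant-time once embeddings are trained

print(f"path_similarity={slow:.3f}  embedding score={fast:.3f}")
```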

Sense Vocabulary Compression through the Semantic Knowledge of WordNet for Neural Word Sense Disambiguation

14 May 2019 · getalp/disambiguate

In this article, we tackle the issue of the limited quantity of manually sense-annotated corpora for the task of word sense disambiguation. We exploit the semantic relationships between senses, such as synonymy, hypernymy and hyponymy, to compress the sense vocabulary of Princeton WordNet, and thus reduce the number of different sense tags that must be observed to disambiguate all words of the lexical database.

WORD SENSE DISAMBIGUATION

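A simplified sketch of what such compression can look like with NLTK's WordNet, climbing hypernyms as long as no two senses of the same lemma collapse into one tag (the grouping rules in getalp/disambiguate are more elaborate):

```python
# Simplified hypernymy-based sense-vocabulary compression: each sense of a lemma is
# relabelled with the highest hypernym that still keeps it distinct from the lemma's
# other senses, so a classifier has far fewer sense tags to learn.
from nltk.corpus import wordnet as wn

def compressed_tags(lemma: str, pos: str = "n") -> dict:
    senses = wn.synsets(lemma, pos=pos)
    tags = {}
    for sense in senses:
        tag = sense
        while tag.hypernyms():
            candidate = tag.hypernyms()[0]
            # stop climbing if the ancestor would conflate this sense with another
            # sense of the same lemma
            if any(candidate == other or candidate in other.closure(lambda x: x.hypernyms())
                   for other in senses if other is not sense):
                break
            tag = candidate
        tags[sense.name()] = tag.name()
    return tags

print(compressed_tags("mouse"))   # e.g. mouse.n.04 may collapse to a device-level tag
```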

Cross-lingual Lexical Sememe Prediction

EMNLP 2018 thunlp/CL-SP

We propose a novel framework to model correlations between sememes and multi-lingual words in low-dimensional semantic space for sememe prediction.

SENTIMENT ANALYSIS · WORD SENSE DISAMBIGUATION

01 Oct 2018
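
A rough, assumption-laden sketch of the shared-space prediction step, with random placeholder vectors standing in for the jointly learned word and sememe embeddings (the actual thunlp/CL-SP model learns the alignment itself):

```python
# Sketch of sememe prediction in a shared low-dimensional space: once words from
# different languages and HowNet sememes live in one space, the sememes of a new
# word can be predicted as its most similar sememe vectors.
import numpy as np

rng = np.random.default_rng(0)
DIM = 200

# Placeholder embeddings: a cross-lingual word vector and a handful of sememe vectors.
word_vec = rng.standard_normal(DIM)
sememe_vecs = {name: rng.standard_normal(DIM)
               for name in ("animal", "tool", "computer", "small", "institution")}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Predicted sememes: the top-k most similar sememe vectors.
ranked = sorted(sememe_vecs, key=lambda s: cosine(word_vec, sememe_vecs[s]), reverse=True)
print(ranked[:3])
```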

xSense: Learning Sense-Separated Sparse Representations and Textual Definitions for Explainable Word Sense Networks

10 Sep 2018 · terarachang/xSense

This paper focuses on interpreting the embeddings for various aspects, including sense separation in the vector dimensions and definition generation.

WORD EMBEDDINGS · WORD SENSE DISAMBIGUATION

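A hedged sketch of the sense-separation part only, using scikit-learn's SparseCoder over a hypothetical basis; the actual xSense model learns its sparse codes end-to-end and also generates textual definitions:

```python
# Sketch of sense separation via sparse coding: a dense word embedding is decomposed
# into a sparse combination of basis vectors, with the hope that different non-zero
# dimensions correspond to different senses. Basis and embedding are random stand-ins.
import numpy as np
from sklearn.decomposition import SparseCoder

rng = np.random.default_rng(0)
DIM, N_BASIS = 300, 50

# Hypothetical basis ("sense atoms") and a dense embedding for an ambiguous word.
basis = rng.standard_normal((N_BASIS, DIM))
basis /= np.linalg.norm(basis, axis=1, keepdims=True)
word_embedding = rng.standard_normal((1, DIM))

coder = SparseCoder(dictionary=basis, transform_algorithm="lasso_lars", transform_alpha=0.5)
codes = coder.transform(word_embedding)      # shape (1, N_BASIS), mostly zeros

print("active sense dimensions:", np.nonzero(codes[0])[0])
```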