Improved Word Sense Disambiguation Using Pre-Trained Contextualized Word Representations

Contextualized word representations are able to give different representations for the same word in different contexts, and they have been shown to be effective in downstream natural language processing tasks, such as question answering, named entity recognition, and sentiment analysis. However, evaluation on word sense disambiguation (WSD) in prior work shows that using contextualized word representations does not outperform the state-of-the-art approach that makes use of non-contextualized word embeddings. In this paper, we explore different strategies of integrating pre-trained contextualized word representations and our best strategy achieves accuracies exceeding the best prior published accuracies by significant margins on multiple benchmark WSD datasets. We make the source code available at https://github.com/nusnlp/contextemb-wsd.
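The strategies evaluated in the paper operate on contextual token vectors taken from a pre-trained model such as BERT. As a minimal illustration (not the authors' released code; it assumes the Hugging Face transformers library, bert-base-cased, and simple last-layer mean pooling over subword pieces), a contextual vector for a target word can be extracted like this:

```python
# Hypothetical sketch: extract a contextual vector for one target word with BERT.
# The paper's exact model, layer choice, and pooling strategy may differ.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("bert-base-cased")
model.eval()

def contextual_vector(words, target_index):
    """Mean-pool the last-layer hidden states of the subword pieces
    belonging to the word at position `target_index`."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]       # (num_tokens, 768)
    piece_ids = [i for i, w in enumerate(enc.word_ids()) if w == target_index]
    return hidden[piece_ids].mean(dim=0)                 # (768,)

# Same surface form, different contexts -> different vectors.
v1 = contextual_vector("He sat on the bank of the river".split(), 4)
v2 = contextual_vector("She deposited money in the bank".split(), 5)
```

Because the vectors depend on the surrounding sentence, the two occurrences of "bank" above receive different representations, which is the property WSD systems exploit.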

Published at IJCNLP 2019.
Results (Word Sense Disambiguation, supervised setting):

Model                    | Senseval 2 | Senseval 3 | SemEval 2007 | SemEval 2013 | SemEval 2015
BERT (linear projection) | 75.5 (#14) | 73.6 (#14) | 68.1 (#14)   | 71.1 (#15)   | 76.2 (#14)
BERT (nearest neighbour) | 73.8 (#16) | 71.6 (#15) | 63.3 (#16)   | 69.2 (#16)   | 74.4 (#16)

Each cell shows the reported metric value and, in parentheses, the global rank on that benchmark.
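
The two models in the table correspond to two strategies for integrating the contextual vectors: a nearest-neighbour matcher that assigns the sense whose averaged training-set vector is closest to the test vector, and a linear projection trained to map the contextual vector to scores over senses. A rough sketch of both follows; the names, dimensions, and details are illustrative assumptions, not the released implementation:

```python
# Hypothetical sketch of the two scoring strategies over frozen contextual vectors.
# `sense_centroids[sense]` is assumed to hold the mean contextual vector of
# training tokens annotated with that sense; all names here are illustrative.
import torch
import torch.nn as nn

def nearest_neighbour_sense(vec, sense_centroids):
    """Pick the sense whose centroid is most cosine-similar to `vec`."""
    best, best_sim = None, float("-inf")
    for sense, centroid in sense_centroids.items():
        sim = torch.cosine_similarity(vec, centroid, dim=0).item()
        if sim > best_sim:
            best, best_sim = sense, sim
    return best

class LinearProjectionWSD(nn.Module):
    """Linear projection from a frozen contextual vector to scores over senses,
    trained with cross-entropy on sense-annotated data."""
    def __init__(self, hidden_size=768, num_senses=1000):
        super().__init__()
        self.proj = nn.Linear(hidden_size, num_senses)

    def forward(self, vec):
        return self.proj(vec)
```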
