All-but-the-Top: Simple and Effective Postprocessing for Word Representations

ICLR 2018 · Jiaqi Mu, Suma Bhat, Pramod Viswanath

Real-valued word representations have transformed NLP applications; popular examples are word2vec and GloVe, recognized for their ability to capture linguistic regularities. In this paper, we demonstrate a very simple, and yet counter-intuitive, postprocessing technique -- eliminate the common mean vector and a few top dominating directions from the word vectors -- that renders off-the-shelf representations even stronger...
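The postprocessing described in the abstract amounts to centering the word vectors and then removing their projections onto the top principal components. Below is a minimal NumPy sketch of that idea; the function name is illustrative, and the default choice of roughly dim/100 removed directions follows the heuristic suggested in the paper, though the paper's own experiments should be consulted for exact settings.

```python
import numpy as np

def all_but_the_top(embeddings, n_components=None):
    """Postprocess word vectors: subtract the common mean, then remove
    the projections onto the top dominating (principal) directions.

    embeddings: (vocab_size, dim) array of word vectors.
    n_components: number of top directions to remove; defaults to
        roughly dim / 100, a heuristic suggested in the paper.
    """
    vocab_size, dim = embeddings.shape
    if n_components is None:
        n_components = max(1, dim // 100)

    # 1. Remove the common mean vector.
    mean = embeddings.mean(axis=0)
    centered = embeddings - mean

    # 2. Find the top principal directions of the centered vectors
    #    (rows of Vt from the SVD, ordered by singular value).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    top_directions = vt[:n_components]            # (n_components, dim)

    # 3. Subtract the projection onto those dominating directions.
    projections = centered @ top_directions.T     # (vocab_size, n_components)
    processed = centered - projections @ top_directions

    return processed
```

For large vocabularies, a truncated SVD or incremental PCA can replace the full SVD above without changing the result for the retained components.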


Evaluation Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Sentiment Analysis | MR | GRU-RNN-WORD2VEC | Accuracy | 78.26 | #6 |
| Sentiment Analysis | SST-5 (fine-grained classification) | GRU-RNN-WORD2VEC | Accuracy | 45.02 | #16 |
| Subjectivity Analysis | SUBJ | GRU-RNN-GLOVE | Accuracy | 91.85 | #8 |
| Text Classification | TREC-6 | GRU-RNN-GLOVE | Error | 7 | #7 |