Word Mover's Embedding: From Word2Vec to Document Embedding

EMNLP 2018 · Lingfei Wu, Ian E. H. Yen, Kun Xu, Fangli Xu, Avinash Balakrishnan, Pin-Yu Chen, Pradeep Ravikumar, Michael J. Witbrock

While the celebrated Word2Vec technique yields semantically rich representations for individual words, there has been comparatively little success in extending it to generate unsupervised sentence or document embeddings. Recent work has demonstrated that a distance measure between documents, called Word Mover's Distance (WMD), which aligns semantically similar words, yields unprecedented KNN classification accuracy...
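To make the WMD idea concrete, here is a minimal sketch of the distance for the special case of two equal-length documents with uniform word weights, where the optimal transport plan reduces to a one-to-one assignment and can be found exactly by brute force. The tiny 2-D word vectors are hypothetical placeholders standing in for real Word2Vec embeddings, not values from the paper.

```python
import itertools
import math

# Hypothetical 2-D word embeddings, for illustration only.
emb = {
    "obama":     (1.0, 0.2),
    "president": (0.9, 0.3),
    "speaks":    (0.1, 1.0),
    "talks":     (0.2, 0.9),
    "banana":    (-1.0, -1.0),
}

def dist(u, v):
    """Euclidean distance between two word vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def wmd_uniform(doc_a, doc_b):
    """Exact WMD for two equal-length docs with uniform word weights.

    In this special case the optimal transport plan is a one-to-one
    matching of words, so brute force over permutations is exact
    (feasible only for very short documents).
    """
    assert len(doc_a) == len(doc_b)
    n = len(doc_a)
    best = min(
        sum(dist(emb[doc_a[i]], emb[doc_b[p[i]]]) for i in range(n))
        for p in itertools.permutations(range(n))
    )
    return best / n  # each word carries mass 1/n

d_close = wmd_uniform(["obama", "speaks"], ["president", "talks"])
d_far = wmd_uniform(["obama", "speaks"], ["banana", "banana"])
print(d_close < d_far)  # semantically similar documents are closer
```

In the general case (unequal lengths, frequency-based weights), WMD is a full optimal-transport problem solved with a linear program rather than a permutation search; the sketch above only illustrates the alignment intuition behind the distance.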

