Learning Word Embeddings without Context Vectors

WS 2019 · Alexey Zobnin, Evgenia Elistratova

Most word embedding algorithms, such as word2vec or fastText, construct two sorts of vectors: one for words and one for contexts. Naively using vectors of only one sort leads to poor results. We suggest using an indefinite inner product in the skip-gram negative sampling (SGNS) algorithm, which allows us to use only one sort of vectors without loss of quality. Our "context-free" (CF) algorithm performs on par with SGNS on word similarity datasets.
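To make the idea concrete, below is a minimal NumPy sketch of SGNS trained with a single embedding table and an indefinite inner product of the form ⟨x, y⟩ = xᵀDy, where D is a fixed diagonal matrix of ±1 entries. The half-positive/half-negative signature of D, the hyperparameters, and the toy data are illustrative assumptions, not the authors' exact formulation.

```python
# Sketch of SGNS with an indefinite inner product and a single embedding
# table. The signature matrix D, the dimension split, and all hyperparameters
# below are illustrative assumptions rather than the paper's exact setup.
import numpy as np

rng = np.random.default_rng(0)

vocab_size = 100
dim = 16
num_neg = 5
lr = 0.05

# One embedding table serves both words and contexts.
W = rng.normal(scale=0.1, size=(vocab_size, dim))

# Indefinite (pseudo-Euclidean) inner product <x, y> = x^T D y with
# D = diag(+1, ..., +1, -1, ..., -1). The half/half split is an assumption.
D = np.ones(dim)
D[dim // 2:] = -1.0

def score(w, c):
    """Indefinite inner product between word w and context c."""
    return W[w] @ (D * W[c])

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(w, c_pos, neg_samples):
    """One SGD step on the SGNS objective with the indefinite product.

    Maximizes log sigma(<w, c+>) + sum over negatives of log sigma(-<w, c->).
    """
    pairs = [(c_pos, 1.0)] + [(c, 0.0) for c in neg_samples]
    grad_w = np.zeros(dim)
    for c, label in pairs:
        g = sigmoid(score(w, c)) - label   # d(loss) / d(score)
        grad_w += g * (D * W[c])
        W[c] -= lr * g * (D * W[w])        # context rows live in the same table
    W[w] -= lr * grad_w

# Toy usage: random (word, context) pairs with uniform negative sampling.
for _ in range(1000):
    w, c_pos = rng.integers(vocab_size, size=2)
    negs = rng.integers(vocab_size, size=num_neg)
    sgns_step(w, c_pos, negs)

print("similarity(0, 1) =", score(0, 1))
```

Because D is diagonal, the product stays symmetric, so the same table can score a token in either the word or the context role without a separate context matrix.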
