Words are Vectors, Dependencies are Matrices: Learning Word Embeddings from Dependency Graphs

WS 2019 · Paula Czarnowska, Guy Emerson, Ann Copestake

Distributional Semantic Models (DSMs) construct vector representations of word meanings based on the contexts in which the words occur. Typically, a word's contexts are defined as its closest neighbours, but they can also be derived from its syntactic dependency relations...
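As a rough illustration of the two notions of context the abstract contrasts, the sketch below extracts linear-window contexts and dependency contexts for a toy sentence. The sentence, the hand-written dependency arcs, and both helper functions are illustrative assumptions, not code from the paper.

```python
# Toy sentence with a hand-written dependency parse (assumed for illustration):
# arcs are (head_index, relation, child_index) triples.
tokens = ["the", "scientist", "wrote", "a", "paper"]
arcs = [(1, "det", 0), (2, "nsubj", 1), (2, "dobj", 4), (4, "det", 3)]

def window_contexts(tokens, i, k=2):
    """Linear-window contexts: up to k neighbours on each side of token i."""
    lo, hi = max(0, i - k), min(len(tokens), i + k + 1)
    return [tokens[j] for j in range(lo, hi) if j != i]

def dependency_contexts(tokens, arcs, i):
    """Dependency contexts: one (relation, neighbour) pair per arc touching
    token i, with the relation marked as inverse when i is the child."""
    ctx = []
    for head, rel, child in arcs:
        if head == i:
            ctx.append((rel, tokens[child]))
        elif child == i:
            ctx.append((rel + "-1", tokens[head]))
    return ctx

# Contexts for "wrote" (index 2): its window neighbours versus its
# syntactic subject and object.
print(window_contexts(tokens, 2))
print(dependency_contexts(tokens, arcs, 2))
```

A DSM would then count such context items (or, as in this paper's framing, treat dependency relations as transformations between word vectors) rather than bare neighbouring words.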



