14 Jan 2020 • Roei Schuster, Tal Schuster, Yoav Meri, Vitaly Shmatikov
Word embeddings, i.e., low-dimensional vector representations such as GloVe and SGNS, encode word "meaning" in the sense that distances between words' vectors correspond to their semantic proximity.
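The distance-as-meaning idea can be illustrated with a minimal sketch: cosine similarity between embedding vectors is higher for semantically related words. The 3-dimensional vectors below are made-up toy values, not real GloVe or SGNS embeddings, which typically have hundreds of dimensions learned from corpus co-occurrence statistics.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy "embeddings" for illustration only (hypothetical values, not trained).
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.85, 0.75, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

sim_related = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_unrelated = cosine_similarity(embeddings["king"], embeddings["apple"])

# Related words should sit closer in the embedding space.
assert sim_related > sim_unrelated
```

In real systems the same computation runs over pretrained vectors loaded from a GloVe or word2vec (SGNS) model; the geometric relationship is what corpus poisoning attacks, as studied in this paper, aim to manipulate.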