no code implementations • 3 Oct 2022 • Dongqiang Yang, Ning Li, Li Zou, Hongwei Ma
To improve vector space models in deriving semantic similarity, we post-process neural word embeddings with deep metric learning, injecting lexical-semantic relations, including syn/antonymy and hypo/hypernymy, into the distributional space.
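A minimal sketch of this kind of post-processing, using a simple attract/repel update as an illustrative stand-in for the paper's deep metric learning objective (the function name, update rule, and hyperparameters here are assumptions for illustration, not the authors' method):

```python
# Sketch: inject lexical relations into pretrained embeddings by pulling
# synonym pairs together and pushing antonym pairs apart (a simplified
# stand-in for a deep metric learning objective).
import numpy as np

def retrofit(vectors, synonyms, antonyms, lr=0.1, margin=1.0, epochs=10):
    """vectors: dict word -> np.ndarray; synonyms/antonyms: lists of word pairs."""
    vecs = {w: v.astype(float).copy() for w, v in vectors.items()}
    for _ in range(epochs):
        for a, b in synonyms:            # attract: shrink the pair's distance
            diff = vecs[a] - vecs[b]
            vecs[a] -= lr * diff
            vecs[b] += lr * diff
        for a, b in antonyms:            # repel: enforce a minimum margin
            diff = vecs[a] - vecs[b]
            dist = np.linalg.norm(diff)
            if 0 < dist < margin:
                push = lr * (margin - dist) * diff / dist
                vecs[a] += push
                vecs[b] -= push
    return vecs
```

After a few epochs, synonym pairs end up closer than in the original space and antonym pairs further apart, which is the basic effect the abstract describes.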
no code implementations • 30 Sep 2022 • Dongqiang Yang, Pikun Wang, Xiaodong Sun, Ning Li
By comparing various vector space models on detecting synonyms in the TOEFL test, we systematically study the salience of syntactic dependencies in accounting for distributional similarity.
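The TOEFL synonym test amounts to choosing, for a target word, the nearest of several candidates in vector space. A minimal cosine-based sketch of that evaluation (the words and vectors below are toy assumptions, not the paper's data or models):

```python
# Sketch: TOEFL-style synonym detection with a vector space model.
# Given a target word and candidate answers, pick the candidate whose
# vector has the highest cosine similarity to the target's vector.
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def pick_synonym(vectors, target, candidates):
    """vectors: dict word -> np.ndarray; returns the best-scoring candidate."""
    return max(candidates, key=lambda c: cosine(vectors[target], vectors[c]))
```

In the paper's setting, the interesting variable is how `vectors` was built (e.g. from syntactic dependencies versus plain co-occurrence); the selection rule itself stays the same.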
no code implementations • 30 Sep 2022 • Dongqiang Yang, Yanqin Yin
Our findings suggest that, without fine-tuning for uniform edge distance, taxonomic similarity measures can rely on the shortest path length as the prime factor in predicting semantic similarity; in contrast to distributional semantics, edge-counting is free from sense-distribution bias in use and can measure word similarity both literally and metaphorically; and the synergy of retrofitting neural embeddings with concept relations in similarity prediction may point to a new trend of leveraging knowledge bases in transfer learning.
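Edge-counting similarity of the kind discussed above can be sketched over a tiny hand-built taxonomy (a stand-in for WordNet; the words, hierarchy, and the 1/(1 + path length) scoring are illustrative assumptions, not the paper's exact measure):

```python
# Sketch: edge-counting (shortest-path) taxonomic similarity over a toy
# hypernym hierarchy; sim(a, b) = 1 / (1 + shortest_path_length(a, b)).
parent = {  # child -> hypernym; "entity" is the root
    "dog": "canine", "canine": "mammal", "cat": "feline", "feline": "mammal",
    "mammal": "animal", "animal": "entity", "car": "vehicle", "vehicle": "entity",
}

def ancestors(word):
    """Return the chain [word, hypernym, ..., root]."""
    chain, w = [word], word
    while w in parent:
        w = parent[w]
        chain.append(w)
    return chain

def path_similarity(a, b):
    up_a, up_b = ancestors(a), ancestors(b)
    common = set(up_a) & set(up_b)
    # path length = edges from a up to a common subsumer, plus edges down to b;
    # take the minimum over all shared ancestors
    length = min(up_a.index(c) + up_b.index(c) for c in common)
    return 1.0 / (1.0 + length)
```

Here `dog` and `cat` (path length 4 through `mammal`) score higher than `dog` and `car` (path length 6 through the root), matching the intuition that shorter taxonomic paths signal stronger similarity.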