1 code implementation • Findings (ACL) 2021 • Pedro Colon-Hernandez, Yida Xin, Henry Lieberman, Catherine Havasi, Cynthia Breazeal, Peter Chin
Retrofitting is a technique for moving word vectors closer together or further apart in their vector space to reflect their relationships in a Knowledge Base (KB).
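As a minimal sketch of the general idea (following the common iterative-update formulation of retrofitting, not necessarily this paper's exact variant), each vector is repeatedly pulled toward its KB neighbors while staying anchored to its original position; the `alpha` and `beta` weights and the toy words below are illustrative assumptions:

```python
import numpy as np

def retrofit(vectors, edges, alpha=1.0, beta=1.0, iters=10):
    """Nudge each word vector toward its KB neighbors while
    keeping it close to its original position.

    vectors: dict mapping word -> np.ndarray
    edges:   list of (word_a, word_b) KB relations
    alpha:   weight anchoring a vector to its original value
    beta:    weight pulling a vector toward each neighbor
    """
    new = {w: v.copy() for w, v in vectors.items()}
    neighbors = {w: [] for w in vectors}
    for a, b in edges:
        if a in vectors and b in vectors:
            neighbors[a].append(b)
            neighbors[b].append(a)
    for _ in range(iters):
        for w, nbrs in neighbors.items():
            if not nbrs:
                continue
            # weighted average of the original vector and the
            # current neighbor vectors
            total = alpha * vectors[w] + beta * sum(new[n] for n in nbrs)
            new[w] = total / (alpha + beta * len(nbrs))
    return new
```

Running this on two related words, e.g. `retrofit({'cat': np.array([1.0, 0.0]), 'feline': np.array([0.0, 1.0])}, [('cat', 'feline')])`, shrinks the distance between the two vectors, which is the "move closer together" effect described above.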
no code implementations • 28 Jan 2021 • Pedro Colon-Hernandez, Catherine Havasi, Jason Alonso, Matthew Huggins, Cynthia Breazeal
In recent years, transformer-based language models have achieved state-of-the-art performance on various NLP benchmarks.
6 code implementations • 12 Dec 2016 • Robyn Speer, Joshua Chin, Catherine Havasi
ConceptNet is designed to represent the general knowledge involved in understanding language, improving natural language applications by allowing the application to better understand the meanings behind the words people use.
no code implementations • LREC 2012 • Robyn Speer, Catherine Havasi
ConceptNet is a knowledge representation project, providing a large semantic graph that describes general human knowledge and how it is expressed in natural language.
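To make the "semantic graph" concrete, assertions in a ConceptNet-style graph can be viewed as (start, relation, end) triples; the relation names `IsA`, `CapableOf`, and `UsedFor` mirror ConceptNet's, but the storage scheme below is an illustrative sketch, not ConceptNet's actual data format:

```python
from collections import defaultdict

# A few ConceptNet-style assertions as (start, relation, end) triples.
assertions = [
    ("cat", "IsA", "animal"),
    ("cat", "CapableOf", "purr"),
    ("knife", "UsedFor", "cutting"),
]

# Index the triples as an adjacency list keyed by the start concept.
graph = defaultdict(list)
for start, rel, end in assertions:
    graph[start].append((rel, end))

def related(concept):
    """Return all (relation, concept) edges leaving a concept."""
    return graph[concept]
```

Here `related("cat")` yields both the `IsA` and `CapableOf` edges, showing how general knowledge about a concept accumulates as edges in the graph.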