no code implementations • COLING 2022 • Sapan Shah, Sreedhar Reddy, Pushpak Bhattacharyya
The retrofitted embeddings achieve better inter-cluster separation and intra-cluster cohesion for words sharing the same emotion, e.g., a joy cluster containing words like fun, happiness, etc., and an anger cluster with words like offence, rage, etc., as evaluated through different cluster-quality metrics.
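The kind of cluster-quality evaluation mentioned above can be illustrated with the silhouette coefficient, one common such metric (a hypothetical sketch with toy 2-D vectors; the paper's actual embeddings and exact metrics are not specified here):

```python
# Hypothetical sketch: scoring emotion clusters of word vectors with the
# silhouette coefficient. Higher means tighter clusters that are further
# apart; retrofitting aims to raise this score.
import numpy as np

def silhouette(X, labels):
    """Mean silhouette coefficient: (b - a) / max(a, b) per point,
    where a = mean distance to own cluster, b = mean distance to the
    nearest other cluster."""
    X = np.asarray(X, dtype=float)
    n = len(X)
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    scores = []
    for i in range(n):
        same = [j for j in range(n) if labels[j] == labels[i] and j != i]
        a = dist[i, same].mean()
        b = min(
            dist[i, [j for j in range(n) if labels[j] == lab]].mean()
            for lab in set(labels) if lab != labels[i]
        )
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))

# Toy 2-D "embeddings": a joy cluster (fun, happiness) and an anger
# cluster (offence, rage), already well separated.
vecs = [[1.0, 1.1], [1.1, 0.9], [-1.0, -1.0], [-0.9, -1.2]]
labels = ["joy", "joy", "anger", "anger"]
print(round(silhouette(vecs, labels), 3))
```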
no code implementations • 27 Feb 2024 • Avadhut Sardeshmukh, Sreedhar Reddy, BP Gautham, Pushpak Bhattacharyya
The resultant model can be used for both forward and inverse prediction, i.e., for predicting the properties of a given microstructure as well as for predicting the microstructure required to obtain given properties.
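The forward/inverse duality can be sketched in miniature with a linear surrogate (purely illustrative; the paper's model is a learned representation, not a linear map): fit a forward map from microstructure descriptors to properties, then invert it via the pseudoinverse to recover a microstructure estimate for target properties.

```python
# Hypothetical sketch: forward prediction (microstructure -> properties)
# and inverse prediction (target properties -> microstructure) with a
# toy linear model. Names and dimensions are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4-dim "microstructure descriptors" -> 2-dim "properties".
X = rng.normal(size=(100, 4))   # microstructure features
W_true = rng.normal(size=(4, 2))
Y = X @ W_true                   # properties (noise-free toy case)

# Forward model: least-squares fit of properties from microstructure.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Inverse prediction: minimum-norm microstructure that achieves the
# target properties, via the pseudoinverse of the forward map.
y_target = np.array([1.0, -0.5])
x_inv = np.linalg.pinv(W).T @ y_target

# Pushing the recovered microstructure forward reproduces the target.
print(np.allclose(x_inv @ W, y_target))
```

In the real setting the inverse problem is ill-posed (many microstructures can yield the same properties), which is why a learned shared representation, rather than a simple pseudoinverse, is needed.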
no code implementations • 29 Oct 2023 • Sapan Shah, Sreedhar Reddy, Pushpak Bhattacharyya
We present a novel retrofitting method to induce emotion aspects into pre-trained language models (PLMs) such as BERT and RoBERTa.
no code implementations • COLING 2020 • Sapan Shah, Sreedhar Reddy, Pushpak Bhattacharyya
We present a novel retrofitting model that can leverage relational knowledge available in a knowledge resource to improve word embeddings.
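For context, the classic retrofitting update that this line of work builds on (Faruqui et al., 2015) pulls a word's vector toward its neighbours in a knowledge resource while keeping it close to its original embedding. The sketch below implements that baseline update, not the authors' novel relational model:

```python
# Sketch of the classic retrofitting update (Faruqui et al., 2015).
# The authors' model extends this idea to typed relational knowledge,
# which is not reproduced here.
import numpy as np

def retrofit(embeddings, edges, alpha=1.0, iters=10):
    """Iteratively average each word with its knowledge-graph
    neighbours, anchored to its original vector by weight `alpha`.

    embeddings: dict word -> np.ndarray (original vectors, kept fixed)
    edges: dict word -> list of neighbour words from the resource
    """
    q = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iters):
        for w, nbrs in edges.items():
            nbrs = [n for n in nbrs if n in q]
            if not nbrs:
                continue
            # Neighbour sum plus the anchored original vector,
            # normalised by the total weight (alpha = beta = 1 case).
            q[w] = (sum(q[n] for n in nbrs)
                    + alpha * len(nbrs) * embeddings[w]) / (2 * len(nbrs))
    return q
```

For example, retrofitting with a synonym edge between "fun" and "happiness" moves their vectors closer together than in the original space while each stays anchored near its starting point.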