1 code implementation • EMNLP (ACL) 2021 • Fernando Alva-Manchego, Abiola Obamuyide, Amit Gajbhiye, Frédéric Blain, Marina Fomicheva, Lucia Specia
We introduce deepQuest-py, a framework for training and evaluating both large and lightweight models for Quality Estimation (QE).
no code implementations • 23 Oct 2023 • Amit Gajbhiye, Zied Bouraoui, Na Li, Usashi Chatterjee, Luis Espinosa Anke, Steven Schockaert
We show that by augmenting the label set with shared properties, we can improve the performance of state-of-the-art models for this task.
no code implementations • 9 Oct 2023 • Usashi Chatterjee, Amit Gajbhiye, Steven Schockaert
The theory of Conceptual Spaces is an influential cognitive-linguistic framework for representing the meaning of concepts.
1 code implementation • COLING 2022 • Amit Gajbhiye, Luis Espinosa-Anke, Steven Schockaert
Grasping the commonsense properties of everyday concepts is an important prerequisite to language understanding.
no code implementations • 3 Aug 2021 • Amit Gajbhiye, Noura Al Moubayed, Steven Bradley
We introduce a new model for NLI called External Knowledge Enhanced BERT (ExBERT), to enrich the contextual representation with real-world commonsense knowledge from external knowledge sources and enhance BERT's language understanding and reasoning capabilities.
1 code implementation • Findings (ACL) 2021 • Amit Gajbhiye, Marina Fomicheva, Fernando Alva-Manchego, Frédéric Blain, Abiola Obamuyide, Nikolaos Aletras, Lucia Specia
Quality Estimation (QE) is the task of automatically predicting Machine Translation quality in the absence of reference translations, making it applicable in real-time settings, such as translating online social media conversations.
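The core idea of reference-free QE can be sketched as a regressor that maps source/translation features to a quality score. This is a minimal illustrative toy, not the paper's model: the feature names, weights, and scoring function here are hypothetical stand-ins for the learned multilingual regressors used in actual QE systems.

```python
# Toy sketch of sentence-level Quality Estimation (QE): predict a
# quality score for a machine translation WITHOUT a reference, using
# only features of the source and the hypothesis. All features and
# weights below are hypothetical, for illustration only.

def qe_features(source: str, hypothesis: str) -> list:
    """Two toy features: length ratio and type/token ratio."""
    src_toks = source.split()
    hyp_toks = hypothesis.split()
    length_ratio = len(hyp_toks) / max(len(src_toks), 1)
    type_token = len(set(hyp_toks)) / max(len(hyp_toks), 1)
    return [length_ratio, type_token]

def predict_quality(source: str, hypothesis: str) -> float:
    """Linear scorer with made-up weights; a real QE model learns its
    parameters from human judgements (e.g. direct assessment scores)."""
    weights = [0.4, 0.6]  # hypothetical weights
    feats = qe_features(source, hypothesis)
    score = sum(w * f for w, f in zip(weights, feats))
    return max(0.0, min(1.0, score))  # clamp to [0, 1]
```

Because no reference translation is consulted, a scorer of this shape can run in real-time settings such as the social-media use case mentioned above.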
no code implementations • 22 Oct 2020 • Amit Gajbhiye, Thomas Winterbottom, Noura Al Moubayed, Steven Bradley
We introduce BiCAM, a model that incorporates real-world commonsense knowledge into NLI models.
no code implementations • 22 Oct 2018 • Amit Gajbhiye, Sardar Jaf, Noura Al Moubayed, A. Stephen McGough, Steven Bradley
In this paper, we propose a novel RNN model for NLI and empirically evaluate the effect of applying dropout at different layers of the model.
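The question of *where* dropout is applied can be sketched as follows. This is a minimal illustration under assumed placements (embedding output, recurrent output, classifier input), not the paper's architecture; the probabilities and the stand-in "RNN step" are hypothetical.

```python
import random

def dropout(vec, p, training=True):
    """Inverted dropout: zero each unit with probability p and rescale
    the survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return list(vec)
    keep = 1.0 - p
    return [v / keep if random.random() < keep else 0.0 for v in vec]

def encode(embedding, p_embed=0.1, p_recurrent=0.3, p_classifier=0.5):
    """Hypothetical NLI encoder showing three candidate dropout sites."""
    x = dropout(embedding, p_embed)   # 1) after the embedding layer
    h = [v * 0.5 for v in x]          # stand-in for an RNN step
    h = dropout(h, p_recurrent)       # 2) on the recurrent outputs
    return dropout(h, p_classifier)   # 3) before the classifier
```

Varying the three probabilities independently (including setting some to zero) is one way to compare the contribution of each placement empirically.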