no code implementations • 7 Jun 2024 • Megh Thakkar, Quentin Fournier, Matthew D Riemer, Pin-Yu Chen, Amal Zouaq, Payel Das, Sarath Chandar
Large language models are first pre-trained on trillions of tokens and then instruction-tuned or aligned to specific preferences.
no code implementations • 16 Apr 2023 • Papa Abdou Karim Karou Diallo, Samuel Reyd, Amal Zouaq
Our study demonstrates that the copy mechanism yields significant performance enhancements for most PLMs and NPLMs.
1 code implementation • 18 Nov 2022 • Rose Hirigoyen, Amal Zouaq, Samuel Reyd
However, current architectures are unable to integrate the knowledge base (KB) schema and handle questions on knowledge resources, classes, and properties unseen during training, rendering them unusable outside the scope of topics covered in the training set.
no code implementations • 9 Nov 2022 • Louis Clouâtre, Prasanna Parthasarathi, Amal Zouaq, Sarath Chandar
In this work, we replicate a study on the importance of local structure, and the relative unimportance of global structure, in a multilingual setting.
no code implementations • 9 Nov 2022 • Louis Clouâtre, Prasanna Parthasarathi, Amal Zouaq, Sarath Chandar
However, this transfer is not universal, with many languages not currently understood by multilingual approaches.
no code implementations • 2 Feb 2022 • Amine Barrak, Bram Adams, Amal Zouaq
Typically, pre-trained language models are fine-tuned for a specific field using transfer learning.
no code implementations • Findings (ACL) 2022 • Louis Clouâtre, Prasanna Parthasarathi, Amal Zouaq, Sarath Chandar
Recent research analyzing the sensitivity of natural language understanding models to word-order perturbations has shown that neural models are surprisingly insensitive to the order of words.
no code implementations • Findings (ACL) 2021 • Louis Clouâtre, Philippe Trempe, Amal Zouaq, Sarath Chandar
However, they scale with man-hours and high-quality data.
Ranked #12 on Link Prediction on WN18RR (using extra training data)
no code implementations • LREC 2020 • Alexandre Bento, Amal Zouaq, Michel Gagnon
In order to achieve interoperability of information in the context of the Semantic Web, it is necessary to find effective ways to align different ontologies.