1 code implementation • 12 Oct 2024 • David Beauchemin, Zachary Gagnon, Richard Khoury
Large Language Models (LLMs) perform outstandingly in various downstream tasks, and the use of the Retrieval-Augmented Generation (RAG) architecture has been shown to improve performance for legal question answering (Nuruzzaman and Hussain, 2020; Louis et al., 2024).
no code implementations • 20 Nov 2023 • David Beauchemin, Marouane Yassine
Segmenting an address into meaningful components, also known as address parsing, is an essential step in many applications from record linkage to geocoding and package delivery.
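To make the task concrete, here is a minimal, deliberately naive rule-based sketch of address parsing on a Canadian-style address. The tag set (StreetNumber, StreetName, Municipality, PostalCode) and the rules are illustrative assumptions, not the paper's method; real parsers use learned sequence taggers to handle the variation these rules cannot.

```python
import re

def parse_address(address):
    """Tag each token of a simple Canadian-style address.

    A deliberately naive rule-based sketch: real address parsers
    (e.g. learned sequence taggers) must cope with far more
    formats, languages, and noise than these rules can.
    """
    tokens = address.split()
    tags = ["StreetName"] * len(tokens)          # default label
    if tokens and tokens[0].isdigit():
        tags[0] = "StreetNumber"                 # leading civic number
    # Canadian postal codes split into two 3-character halves, e.g. "G1L 1B6".
    if len(tokens) >= 2 and re.fullmatch(r"[A-Za-z]\d[A-Za-z]", tokens[-2]) \
            and re.fullmatch(r"\d[A-Za-z]\d", tokens[-1]):
        tags[-2] = tags[-1] = "PostalCode"
        if len(tokens) >= 3:
            tags[-3] = "Municipality"            # token just before the code
    return list(zip(tokens, tags))

print(parse_address("350 rue des Lilas Ouest Québec G1L 1B6"))
```

Even on this one example, the rules are brittle (they would mislabel an address without a postal code), which is why learned models dominate the task.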
1 code implementation • 9 Apr 2023 • David Beauchemin, Richard Khoury
RISC generates look-alike automobile insurance contracts based on the Quebec regulatory insurance form in French and English.
2 code implementations • 27 Aug 2022 • Vincent Primpied, David Beauchemin, Richard Khoury
Measuring a document's complexity level is an open challenge, particularly when one is working on a diverse corpus of documents rather than comparing several documents on a similar topic or working on a language other than English.
2 code implementations • 11 Apr 2022 • David Beauchemin, Julien Laumonier, Yvan Le Ster, Marouane Yassine
Understanding the evolution of job requirements is becoming increasingly important for workers, companies, and public organizations seeking to keep pace with the rapid transformation of the employment market.
1 code implementation • 7 Dec 2021 • Marouane Yassine, David Beauchemin, François Laviolette, Luc Lamontagne
While these models yield notable results, previous neural-network work has focused only on parsing addresses from a single source country.
no code implementations • INLG (ACL) 2020 • David Beauchemin, Nicolas Garneau, Eve Gaumond, Pierre-Luc Déziel, Richard Khoury, Luc Lamontagne
Plumitifs (dockets) were initially a tool for law clerks.
3 code implementations • 29 Jun 2020 • Marouane Yassine, David Beauchemin, François Laviolette, Luc Lamontagne
We propose an approach in which we employ subword embeddings and a Recurrent Neural Network architecture to build a single model capable of learning to parse addresses from multiple countries at the same time while taking into account the difference in languages and address formatting systems.
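A small sketch of the subword intuition behind such a shared multinational model: decomposing tokens into character n-grams lets one vocabulary cover street words across languages (e.g. "rue", "street", "straße"). This is an illustration of the general idea only; the paper uses learned subword embeddings feeding a recurrent network, not raw n-gram lists.

```python
def subword_units(token, n=3):
    """Decompose a token into overlapping character n-grams.

    Boundary markers "<" and ">" distinguish prefixes and suffixes,
    as in common subword-embedding schemes. Illustrative sketch only:
    a trained model maps such units to embedding vectors consumed by
    an RNN tagger shared across countries.
    """
    padded = f"<{token}>"                         # mark word boundaries
    return [padded[i:i + n] for i in range(len(padded) - n + 1)]

# "rue" and "street" share no full-word entry, but each yields
# n-grams a shared embedding table can cover.
print(subword_units("rue"))
print(subword_units("street"))
```

Because the units are sub-lexical, an address token unseen at training time still decomposes into mostly known n-grams, which is what allows a single model to generalize across address formats and languages.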
1 code implementation • LREC 2020 • Nicolas Garneau, Mathieu Godbout, David Beauchemin, Audrey Durand, Luc Lamontagne
In this paper, we reproduce the experiments of Artetxe et al. (2018b) regarding the robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings.