Search Results for author: Raheel Qader

Found 13 papers, 0 papers with code

Controllable Neural Natural Language Generation: comparison of state-of-the-art control strategies

no code implementations ACL (WebNLG, INLG) 2020 Yuanmin Leng, François Portet, Cyril Labbé, Raheel Qader

In this paper, we study different strategies of control in triple-to-text generation systems, particularly from the aspects of text structure and text length.

Text Generation

Encouraging Neural Machine Translation to Satisfy Terminology Constraints

no code implementations JEP/TALN/RECITAL 2022 Melissa Ailem, Jingshu Liu, Raheel Qader

Empirical results show that our method improves upon related baselines in terms of both BLEU score and the percentage of generated constraint terms.

Machine Translation · Translation
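The abstract above reports the percentage of generated constraint terms alongside BLEU. As a minimal sketch (the function name and the exact matching rule are assumptions, not the paper's implementation), such a terminology-coverage metric could be computed like this:

```python
# Hedged sketch of a terminology-coverage metric: the share of required
# constraint terms that actually appear in the generated translations.
# Naive lowercase substring matching is an assumption for illustration;
# a real evaluation would typically match on tokenized, lemmatized text.

def term_coverage(hypotheses, constraints):
    """hypotheses: list of generated sentences;
    constraints: per-sentence lists of required terminology terms.
    Returns the fraction of terms found in their hypothesis."""
    total = matched = 0
    for hyp, terms in zip(hypotheses, constraints):
        hyp_lower = hyp.lower()
        for term in terms:
            total += 1
            if term.lower() in hyp_lower:
                matched += 1
    return matched / total if total else 0.0

hyps = ["the central bank raised the base rate"]
cons = [["base rate", "central bank"]]
print(term_coverage(hyps, cons))  # 1.0
```

A hypothesis missing one of its two required terms would score 0.5 on that sentence, so the metric rewards systems that copy mandated terminology verbatim.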

Large Language Model Adaptation for Financial Sentiment Analysis

no code implementations26 Jan 2024 Pau Rodriguez Inserte, Mariam Nakhlé, Raheel Qader, Gaetan Caillaut, Jingshu Liu

We show that through careful fine-tuning on both financial documents and instructions, these foundation models can be adapted to the target domain.

Language Modelling · Large Language Model +2

Lingua Custodia's participation at the WMT 2021 Machine Translation using Terminologies shared task

no code implementations 3 Nov 2021 Melissa Ailem, Jingshu Liu, Raheel Qader

This paper describes Lingua Custodia's submission to the WMT21 shared task on machine translation using terminologies.

Machine Translation · Translation

Seq2SeqPy: A Lightweight and Customizable Toolkit for Neural Sequence-to-Sequence Modeling

no code implementations LREC 2020 Raheel Qader, François Portet, Cyril Labbé

We present Seq2SeqPy, a lightweight toolkit for sequence-to-sequence modeling that prioritizes simplicity and the ability to easily customize standard architectures.

Fine-Grained Control of Sentence Segmentation and Entity Positioning in Neural NLG

no code implementations WS 2019 Kritika Mehta, Raheel Qader, Cyril Labbé, François Portet

The results demonstrate that the position identifiers do constrain the neural model to respect the intended output structure, which can be useful in a variety of domains that require the generated text to follow a certain structure.

Data-to-Text Generation · Position +2
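The entry above describes controlling sentence segmentation and entity positioning via position identifiers added to the neural model's input. As a minimal sketch (the tag format and function name are illustrative assumptions, not the paper's exact scheme), such identifiers could be attached to input entities like this:

```python
# Hedged sketch: prefixing each input entity with sentence and
# in-sentence position tags so a seq2seq model can be trained to
# respect the intended output structure. The <S*>/<P*> token format
# is an assumption for illustration, not the paper's actual encoding.

def annotate(entities, plan):
    """entities: input entities in order;
    plan: entity -> (sentence_index, position_within_sentence).
    Returns the tagged input string fed to the encoder."""
    pieces = []
    for ent in entities:
        s, p = plan[ent]
        pieces.append(f"<S{s}> <P{p}> {ent}")
    return " ".join(pieces)

plan = {"Alan_Bean": (1, 1), "Apollo_12": (1, 2)}
print(annotate(["Alan_Bean", "Apollo_12"], plan))
# <S1> <P1> Alan_Bean <S1> <P2> Apollo_12
```

At training time the tags are derived from the reference text, so at inference time the user can set them to request a specific segmentation and entity order.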

Ajout automatique de disfluences pour la synthèse de la parole spontanée : formalisation et preuve de concept (Automatic disfluency insertion towards spontaneous TTS: formalization and proof of concept)

no code implementations JEP/TALN/RECITAL 2017 Raheel Qader, Gwénolé Lecorvé, Damien Lolive, Pascale Sébillot

This article presents exploratory work on the automatic insertion of disfluencies, i.e. pauses, repetitions, and revisions, into the input utterances of a speech synthesis system.
