1 code implementation • LREC 2022 • Harritxu Gete, Thierry Etchegoyhen, David Ponce, Gorka Labaka, Nora Aranberri, Ander Corral, Xabier Saralegi, Igor Ellakuria, Maite Martin
Document-level Neural Machine Translation aims to improve translation quality by taking contextual information from surrounding sentences into account.
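As a concrete illustration (a minimal sketch, not the paper's implementation), a common way to expose context to an otherwise sentence-level model is to concatenate each source sentence with its preceding sentence(s), separated by a break token such as `<BRK>`:

```python
# Minimal sketch of concatenation-based document-level NMT input
# preparation: each sentence is prefixed with its predecessor(s) so a
# standard sentence-level model can attend to context. The <BRK> token
# and the one-sentence window are illustrative choices.

def add_context(doc_sentences, n_prev=1, brk="<BRK>"):
    """Prefix each sentence with its n_prev predecessors in the document."""
    augmented = []
    for i, sent in enumerate(doc_sentences):
        context = doc_sentences[max(0, i - n_prev):i]
        augmented.append(f" {brk} ".join(context + [sent]))
    return augmented

doc = [
    "The committee met on Tuesday.",
    "It approved the new budget.",  # "It" is only resolvable with context
]
for line in add_context(doc):
    print(line)
```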
no code implementations • RANLP 2021 • Thierry Etchegoyhen, David Ponce, Harritxu Gete, Victor Ruiz
Adaptive Machine Translation aims to dynamically incorporate user feedback to improve translation quality.
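A minimal sketch of the adaptive setting, assuming a simple translation-memory cache over a generic `base_translate` callable (both are illustrative placeholders, not the paper's system):

```python
# Sketch of adaptive MT via an incremental translation memory: user
# post-edits are stored and reused on exact source matches, falling back
# to the underlying model otherwise.

class AdaptiveTranslator:
    def __init__(self, base_translate):
        self.base_translate = base_translate  # any callable wrapping an NMT model
        self.memory = {}                      # source -> user-approved target

    def translate(self, source):
        # Reuse stored user feedback when the same source reappears.
        return self.memory.get(source) or self.base_translate(source)

    def feedback(self, source, post_edit):
        # Incorporate the user's correction for future requests.
        self.memory[source] = post_edit

mt = AdaptiveTranslator(base_translate=lambda s: f"[MT] {s}")
print(mt.translate("Bonjour"))          # falls back to the base model
mt.feedback("Bonjour", "Good morning")  # user post-edits the output
print(mt.translate("Bonjour"))          # the correction is reused
```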
no code implementations • 18 Jun 2024 • Harritxu Gete, Thierry Etchegoyhen
Neural Machine Translation models tend to perpetuate gender bias present in their training data distribution.
no code implementations • 9 Feb 2024 • Harritxu Gete, Thierry Etchegoyhen
Standard context-aware neural machine translation (NMT) typically relies on parallel document-level data, exploiting both source and target contexts.
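One standard way to exploit both contexts, sketched below assuming sentence-aligned documents (an illustration of the setting, not the paper's exact method), is to build "2-to-2" training pairs in which source and target sentences each carry their predecessor:

```python
# Sketch of 2-to-2 pair construction from parallel document-level data:
# both sides are concatenated with the previous sentence, so the model
# sees source and target context during training.

def make_2to2_pairs(src_doc, tgt_doc, brk="<BRK>"):
    """Pair each sentence with its predecessor on both sides."""
    pairs = []
    for i in range(1, len(src_doc)):
        src = f"{src_doc[i-1]} {brk} {src_doc[i]}"
        tgt = f"{tgt_doc[i-1]} {brk} {tgt_doc[i]}"
        pairs.append((src, tgt))
    return pairs

src = ["Sie kam gestern an.", "Sie war müde."]
tgt = ["She arrived yesterday.", "She was tired."]
print(make_2to2_pairs(src, tgt))
```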
no code implementations • 18 Dec 2023 • David Ponce, Thierry Etchegoyhen, Jesús Calleja Pérez, Harritxu Gete
Our results provide a fine-grained analysis of the potential and limitations of large language models for Split and Rephrase (SPRP): significant improvements are achievable with relatively small amounts of training data and model parameters overall, although limitations remain for all models on the task.
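A minimal sketch of the SPRP setup with an instruction-tuned LLM; the prompt wording and the `generate` callable are illustrative placeholders, not the paper's configuration:

```python
# Sketch of Split and Rephrase (SPRP) via prompting: a complex sentence
# is rewritten as several shorter, meaning-preserving sentences.

PROMPT = (
    "Rewrite the following sentence as several shorter sentences, "
    "preserving its full meaning:\n\n{sentence}\n\nRewritten:"
)

def split_and_rephrase(sentence, generate):
    """`generate` is any callable wrapping an LLM completion endpoint."""
    return generate(PROMPT.format(sentence=sentence))

# Dummy generator so the sketch runs end to end.
dummy = lambda p: "The house was built in 1850. It overlooks the river."
print(split_and_rephrase(
    "The house, which overlooks the river, was built in 1850.", dummy))
```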
no code implementations • LREC 2020 • Thierry Etchegoyhen, Harritxu Gete
We present the results of a case study on exploiting comparable corpora for Neural Machine Translation.
no code implementations • LREC 2020 • Thierry Etchegoyhen, Harritxu Gete
We present a comparative evaluation of casing methods for Neural Machine Translation, to help establish an optimal pre- and post-processing methodology.
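For illustration, a minimal sketch contrasting two casing strategies typically compared in such evaluations, lowercasing versus truecasing; the frequency-based truecaser below is a toy under simplifying assumptions, not the paper's pipeline:

```python
# Sketch of truecasing as an NMT pre-processing step: learn each word's
# most frequent surface form from non-sentence-initial positions (where
# casing is informative) and apply it at inference time.

from collections import Counter, defaultdict

def train_truecaser(corpus_sentences):
    """Learn each word's most frequent case, skipping sentence-initial tokens."""
    counts = defaultdict(Counter)
    for sent in corpus_sentences:
        for tok in sent.split()[1:]:
            counts[tok.lower()][tok] += 1
    return {w: c.most_common(1)[0][0] for w, c in counts.items()}

def truecase(sentence, model):
    return " ".join(model.get(t.lower(), t.lower()) for t in sentence.split())

corpus = ["We met Alice in Paris .", "Later , Alice left Paris ."]
model = train_truecaser(corpus)
print("lowercased:", "we met alice in paris .")
print("truecased :", truecase("We met Alice in Paris .", model))
```

Lowercasing discards case entirely and delegates restoration to a separate post-processing step, while truecasing keeps informative case in the training data; comparative evaluations like this one weigh the two against keeping original casing.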