no code implementations • EMNLP 2021 • Ahmet Üstün, Alexandre Bérard, Laurent Besacier, Matthias Gallé
We consider the problem of multilingual unsupervised machine translation, translating to and from languages that only have monolingual data by using auxiliary parallel language pairs.
no code implementations • WMT (EMNLP) 2021 • Asa Cooper Stickland, Alexandre Bérard, Vassilina Nikoulina
In this work we study the compositionality of language and domain adapters in the context of Machine Translation.
1 code implementation • EMNLP (NLP-COVID19) 2020 • Alexandre Bérard, Zae Myung Kim, Vassilina Nikoulina, Eunjeong L. Park, Matthias Gallé
We release a multilingual neural machine translation model, which can be used to translate text in the biomedical domain.
no code implementations • WS 2019 • Fahimeh Saleh, Alexandre Bérard, Ioan Calapodescu, Laurent Besacier
To address these challenges, we propose to leverage data from both tasks and perform transfer learning between MT, NLG, and MT with source-side metadata (MT+NLG).
no code implementations • WS 2019 • Alexandre Bérard, Ioan Calapodescu, Marc Dymetman, Claude Roux, Jean-Luc Meunier, Vassilina Nikoulina
We share a French-English parallel corpus of Foursquare restaurant reviews (https://europe.naverlabs.com/research/natural-language-processing/machine-translation-of-restaurant-reviews), and define a new task to encourage research on Neural Machine Translation robustness and domain adaptation, in a real-world scenario where better-quality MT would be greatly beneficial.
no code implementations • WS 2019 • Alexandre Bérard, Ioan Calapodescu, Claude Roux
This paper describes the systems that we submitted to the WMT19 Machine Translation robustness task.
1 code implementation • 12 Feb 2018 • Alexandre Bérard, Laurent Besacier, Ali Can Kocabiyikoglu, Olivier Pietquin
We investigate end-to-end speech-to-text translation on a corpus of audiobooks specifically augmented for this task.