no code implementations • CoNLL (EMNLP) 2021 • Siddique Latif, Inyoung Kim, Ioan Calapodescu, Laurent Besacier
In this paper, we investigate whether we can control prosody directly from the input text, in order to encode information related to contrastive focus, which emphasizes a specific word that is contrary to the presuppositions of the interlocutor.
no code implementations • WMT (EMNLP) 2020 • Alexandre Berard, Ioan Calapodescu, Vassilina Nikoulina, Jerin Philip
This paper describes Naver Labs Europe’s participation in the Robustness, Chat, and Biomedical Translation tasks at WMT 2020.
no code implementations • 13 Jun 2023 • Edward Gow-Smith, Alexandre Berard, Marcely Zanon Boito, Ioan Calapodescu
This paper presents NAVER LABS Europe's systems for Tamasheq-French and Quechua-Spanish speech translation in the IWSLT 2023 Low-Resource track.
1 code implementation • 21 Oct 2022 • Laurent Besacier, Swen Ribeiro, Olivier Galibert, Ioan Calapodescu
In this paper, we introduce a new and simple method for comparing speech utterances without relying on text transcripts.
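For intuition, here is a minimal sketch of one transcript-free comparison baseline: embed each utterance with a pretrained self-supervised speech encoder (wav2vec 2.0 here) and compare pooled representations. This is an illustrative stand-in, not necessarily the method proposed in the paper.

```python
# Illustrative transcript-free utterance comparison: mean-pooled
# self-supervised speech embeddings + cosine similarity. Not the
# paper's method, just a plausible baseline for the task described.
import torch
import torch.nn.functional as F
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-base")
encoder = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base")
encoder.eval()

def embed(waveform: torch.Tensor) -> torch.Tensor:
    """Mean-pool the encoder's frame-level states into one vector."""
    inputs = extractor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        frames = encoder(inputs.input_values).last_hidden_state  # (1, T, H)
    return frames.mean(dim=1).squeeze(0)                         # (H,)

def utterance_similarity(wav_a: torch.Tensor, wav_b: torch.Tensor) -> float:
    """Cosine similarity between two 16 kHz mono waveforms."""
    return F.cosine_similarity(embed(wav_a), embed(wav_b), dim=0).item()
```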
no code implementations • Findings (ACL) 2022 • Cheonbok Park, Hantae Kim, Ioan Calapodescu, Hyunchang Cho, Vassilina Nikoulina
Domain Adaptation (DA) of a Neural Machine Translation (NMT) model often relies on a pre-trained general NMT model that is adapted to the new domain on a sample of in-domain parallel data.
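A minimal sketch of the standard DA recipe this sentence describes: start from a pre-trained general NMT model and continue training (fine-tune) on a small in-domain parallel sample. The model name and data below are placeholders.

```python
# Standard DA baseline: fine-tune a general pre-trained NMT model on
# a (hypothetical) in-domain parallel sample.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "Helsinki-NLP/opus-mt-en-fr"  # any pre-trained general NMT model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

# Hypothetical in-domain (source, target) pairs.
in_domain = [("the soup was cold", "la soupe était froide")]

model.train()
for epoch in range(3):
    for src, tgt in in_domain:
        batch = tokenizer(src, text_target=tgt, return_tensors="pt")
        loss = model(**batch).loss  # cross-entropy on target tokens
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```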
no code implementations • WS 2019 • Fahimeh Saleh, Alexandre Bérard, Ioan Calapodescu, Laurent Besacier
To address these challenges, we propose to leverage data from both tasks and do transfer learning between MT, NLG, and MT with source-side metadata (MT+NLG).
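One common way to realize this kind of multi-task transfer (shown here as a sketch, not the paper's exact setup) is to train a single seq2seq model on the mixed data, marking each example's task with a source-side tag.

```python
# Illustrative task-tagging scheme for joint MT / NLG / MT+NLG training.
# Tags and examples are placeholders.
def tag_example(task: str, source: str) -> str:
    """Prefix the source with a task token so one model can serve all tasks."""
    return f"<{task}> {source}"

mixed_training_data = [
    (tag_example("mt", "the hotel is near the station"),
     "l'hôtel est près de la gare"),
    (tag_example("nlg", "name=Blue Spice | food=Chinese"),
     "Blue Spice serves Chinese food."),
    (tag_example("mt+nlg", "the hotel is near the station | stars=4"),
     "l'hôtel quatre étoiles est près de la gare"),
]
# Each (source, target) pair can then be fed to the same fine-tuning
# loop as in the domain-adaptation sketch above.
```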
no code implementations • WS 2019 • Alexandre Bérard, Ioan Calapodescu, Marc Dymetman, Claude Roux, Jean-Luc Meunier, Vassilina Nikoulina
We share a French-English parallel corpus of Foursquare restaurant reviews (https://europe.naverlabs.com/research/natural-language-processing/machine-translation-of-restaurant-reviews), and define a new task to encourage research on Neural Machine Translation robustness and domain adaptation, in a real-world scenario where better-quality MT would be greatly beneficial.
no code implementations • WS 2019 • Alexandre Bérard, Ioan Calapodescu, Claude Roux
This paper describes the systems that we submitted to the WMT19 Machine Translation robustness task.
no code implementations • 24 Dec 2018 • Cong Duy Vu Hoang, Ioan Calapodescu, Marc Dymetman
Previous work has shown that neural sequence models improve significantly when external prior knowledge is provided, for instance by allowing the model to access the embeddings of explicit features during both training and inference.
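A minimal sketch of the mechanism this sentence describes: let a sequence model see embeddings of explicit external features alongside the usual token embeddings. Dimensions and the feature inventory are illustrative placeholders.

```python
# Feature-aware input layer: concatenate per-token feature embeddings
# to word embeddings, then project back to the model dimension. The
# same layer is used at training and inference time.
import torch
import torch.nn as nn

class FeatureAwareEmbedding(nn.Module):
    """Concatenate per-token feature embeddings to word embeddings."""
    def __init__(self, vocab_size=10_000, n_features=32, d_word=256, d_feat=32):
        super().__init__()
        self.words = nn.Embedding(vocab_size, d_word)
        self.feats = nn.Embedding(n_features, d_feat)
        self.proj = nn.Linear(d_word + d_feat, d_word)

    def forward(self, token_ids, feature_ids):
        x = torch.cat([self.words(token_ids), self.feats(feature_ids)], dim=-1)
        return self.proj(x)  # (batch, seq, d_word)

emb = FeatureAwareEmbedding()
tokens = torch.randint(0, 10_000, (2, 7))  # batch of token ids
features = torch.randint(0, 32, (2, 7))    # one explicit feature per token
out = emb(tokens, features)                # -> torch.Size([2, 7, 256])
```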