Search Results for author: Ioan Calapodescu

Found 11 papers, 1 paper with code

Controlling Prosody in End-to-End TTS: A Case Study on Contrastive Focus Generation

no code implementations · CoNLL (EMNLP) 2021 · Siddique Latif, Inyoung Kim, Ioan Calapodescu, Laurent Besacier

In this paper, we investigate whether we can control prosody directly from the input text, in order to encode information related to contrastive focus, which emphasizes a specific word that is contrary to the presuppositions of the interlocutor.

NAVER LABS Europe's Multilingual Speech Translation Systems for the IWSLT 2023 Low-Resource Track

no code implementations · 13 Jun 2023 · Edward Gow-Smith, Alexandre Berard, Marcely Zanon Boito, Ioan Calapodescu

This paper presents NAVER LABS Europe's systems for Tamasheq-French and Quechua-Spanish speech translation in the IWSLT 2023 Low-Resource track.

Tasks: Translation

A Textless Metric for Speech-to-Speech Comparison

1 code implementation · 21 Oct 2022 · Laurent Besacier, Swen Ribeiro, Olivier Galibert, Ioan Calapodescu

In this paper, we introduce a new and simple method for comparing speech utterances without relying on text transcripts.

Tasks: Sentence, Speech-to-Speech Translation (+1)

DaLC: Domain Adaptation Learning Curve Prediction for Neural Machine Translation

no code implementations · Findings (ACL) 2022 · Cheonbok Park, Hantae Kim, Ioan Calapodescu, Hyunchang Cho, Vassilina Nikoulina

Domain Adaptation (DA) of a Neural Machine Translation (NMT) model often relies on a pre-trained general NMT model, which is adapted to the new domain on a sample of in-domain parallel data.

Tasks: Domain Adaptation, Machine Translation (+2)

Naver Labs Europe's Systems for the Document-Level Generation and Translation Task at WNGT 2019

no code implementations · WS 2019 · Fahimeh Saleh, Alexandre Bérard, Ioan Calapodescu, Laurent Besacier

To address these challenges, we propose to leverage data from both tasks and do transfer learning between MT, NLG, and MT with source-side metadata (MT+NLG).

Tasks: Descriptive, Machine Translation (+4)

Machine Translation of Restaurant Reviews: New Corpus for Domain Adaptation and Robustness

no code implementations · WS 2019 · Alexandre Bérard, Ioan Calapodescu, Marc Dymetman, Claude Roux, Jean-Luc Meunier, Vassilina Nikoulina

We share a French-English parallel corpus of Foursquare restaurant reviews (https://europe.naverlabs.com/research/natural-language-processing/machine-translation-of-restaurant-reviews), and define a new task to encourage research on Neural Machine Translation robustness and domain adaptation, in a real-world scenario where better-quality MT would be greatly beneficial.

Tasks: Domain Adaptation, Machine Translation (+2)

Moment Matching Training for Neural Machine Translation: A Preliminary Study

no code implementations · 24 Dec 2018 · Cong Duy Vu Hoang, Ioan Calapodescu, Marc Dymetman

In previous work, neural sequence models have been shown to improve significantly if external prior knowledge can be provided, for instance by allowing the model to access the embeddings of explicit features during both training and inference.

Tasks: Machine Translation, Translation
