Search Results for author: David Vilar

Found 16 papers, 5 papers with code

Bandits Don’t Follow Rules: Balancing Multi-Facet Machine Translation with Multi-Armed Bandits

no code implementations • Findings (EMNLP) 2021 • Julia Kreutzer, David Vilar, Artem Sokolov

Training data for machine translation (MT) is often sourced from a multitude of large corpora that are multi-faceted in nature, e.g. containing content from multiple domains or different levels of quality or complexity.

Machine Translation • Multi-Armed Bandits • +1
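
The entry above frames corpus balancing as a multi-armed bandit problem. As a rough illustration only (not the paper's exact algorithm), the sketch below uses an EXP3-style bandit to decide which corpus the next training batch is drawn from; `sample_batch`, `train_step`, and `dev_score` are hypothetical placeholder hooks.

```python
import math
import random


class Exp3CorpusSampler:
    """EXP3-style bandit over training corpora: each arm is one corpus or
    facet, and the reward is a scalar training signal (here, the change in a
    dev-set score, clipped to [0, 1])."""

    def __init__(self, num_corpora, gamma=0.1):
        self.gamma = gamma                  # exploration rate
        self.weights = [1.0] * num_corpora  # one weight per corpus (arm)

    def probabilities(self):
        total = sum(self.weights)
        k = len(self.weights)
        return [(1.0 - self.gamma) * w / total + self.gamma / k for w in self.weights]

    def choose(self):
        probs = self.probabilities()
        arm = random.choices(range(len(probs)), weights=probs)[0]
        return arm, probs

    def update(self, arm, reward, probs):
        # Importance-weighted reward keeps the estimate unbiased under the
        # non-uniform sampling distribution; EXP3 assumes rewards in [0, 1].
        estimated = reward / probs[arm]
        self.weights[arm] *= math.exp(self.gamma * estimated / len(self.weights))


def train_with_bandit(corpora, sample_batch, train_step, dev_score, steps=1000):
    """Hypothetical training loop; all four callables are placeholders."""
    bandit = Exp3CorpusSampler(num_corpora=len(corpora))
    previous = dev_score()
    for _ in range(steps):
        arm, probs = bandit.choose()
        train_step(sample_batch(corpora[arm]))
        current = dev_score()
        reward = min(1.0, max(0.0, current - previous))  # clip to [0, 1]
        bandit.update(arm, reward, probs)
        previous = current
```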

Controlling Machine Translation for Multiple Attributes with Additive Interventions

no code implementations • EMNLP 2021 • Andrea Schioppa, David Vilar, Artem Sokolov, Katja Filippova

Fine-grained control of machine translation (MT) outputs along multiple attributes is critical for many modern MT applications and is a requirement for gaining users’ trust.

Fine-tuning • Machine Translation • +1
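
One way to read "additive interventions" is adding learned attribute vectors onto the encoder representations of a seq2seq model. The sketch below illustrates that reading only; the attribute names and the insertion point are assumptions for illustration, not the paper's exact recipe.

```python
import torch
import torch.nn as nn


class AdditiveAttributeControl(nn.Module):
    """Steer a seq2seq model along several attributes by adding learned,
    intensity-scaled attribute vectors to the encoder states (illustrative)."""

    def __init__(self, d_model, attributes=("formality", "length", "domain")):
        super().__init__()
        # One learnable vector per controllable attribute.
        self.interventions = nn.ParameterDict(
            {name: nn.Parameter(torch.zeros(d_model)) for name in attributes}
        )

    def forward(self, encoder_states, controls):
        """encoder_states: (batch, src_len, d_model);
        controls: dict mapping attribute name -> intensity in [0, 1]."""
        out = encoder_states
        for name, intensity in controls.items():
            # Scale each attribute vector by the requested intensity and add it.
            out = out + intensity * self.interventions[name]
        return out


# Usage: full formality control combined with mild length control.
control = AdditiveAttributeControl(d_model=512)
states = torch.randn(2, 10, 512)
steered = control(states, {"formality": 1.0, "length": 0.3})
```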

Sockeye 2: A Toolkit for Neural Machine Translation

1 code implementation • EAMT 2020 • Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar

We present Sockeye 2, a modernized and streamlined version of the Sockeye neural machine translation (NMT) toolkit.

Machine Translation • Translation

A Statistical Extension of Byte-Pair Encoding

1 code implementation • ACL (IWSLT) 2021 • David Vilar, Marcello Federico

Sub-word segmentation is currently a standard tool for training neural machine translation (MT) systems and for other NLP tasks.

Data Compression • Machine Translation • +1
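
For background on the sub-word segmentation this paper extends, the sketch below is a minimal standard byte-pair encoding (BPE) learner: repeatedly merge the most frequent adjacent symbol pair in the training vocabulary. It shows plain BPE only, not the paper's statistical extension.

```python
from collections import Counter


def learn_bpe(words, num_merges):
    """Learn BPE merge operations from a word-frequency dictionary."""
    # Represent each word as a tuple of symbols, keeping word frequencies.
    vocab = Counter()
    for word, freq in words.items():
        vocab[tuple(word)] += freq

    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the chosen merge to every word in the vocabulary.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] += freq
        vocab = new_vocab
    return merges


# Example: learn 10 merges from a toy frequency dictionary.
print(learn_bpe({"lower": 5, "lowest": 3, "newer": 6, "wider": 2}, 10))
```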

Fast Lexically Constrained Decoding with Dynamic Beam Allocation for Neural Machine Translation

no code implementations • NAACL 2018 • Matt Post, David Vilar

The end-to-end nature of neural machine translation (NMT) removes many ways of manually guiding the translation process that were available in older paradigms.

Machine Translation • Translation
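
The core idea behind the dynamic beam allocation of this paper can be pictured as grouping beam candidates into banks by how many lexical constraints they have satisfied, then splitting the beam across those banks so hypotheses at every level of constraint progress survive. The sketch below shows only that allocation step, in a simplified reading rather than the exact algorithm; the candidate tuples are hypothetical.

```python
def allocate_beam(candidates, beam_size):
    """candidates: list of (score, constraints_met, hypothesis) tuples,
    higher scores are better. Returns at most beam_size candidates."""
    # Group candidates into banks keyed by the number of constraints met.
    banks = {}
    for cand in candidates:
        banks.setdefault(cand[1], []).append(cand)
    for bank in banks.values():
        bank.sort(key=lambda c: c[0], reverse=True)

    # Give each bank an (approximately) equal share of the beam, then fill
    # any leftover slots with the best remaining candidates overall.
    share = max(1, beam_size // max(1, len(banks)))
    selected, leftovers = [], []
    for met in sorted(banks, reverse=True):  # prefer more-satisfied banks first
        selected.extend(banks[met][:share])
        leftovers.extend(banks[met][share:])
    leftovers.sort(key=lambda c: c[0], reverse=True)
    selected.extend(leftovers[: max(0, beam_size - len(selected))])
    return selected[:beam_size]


# Example: beam of 4 over candidates with 0-2 constraints satisfied.
cands = [(-1.0, 2, "h1"), (-0.5, 0, "h2"), (-0.7, 1, "h3"),
         (-0.9, 1, "h4"), (-0.4, 0, "h5"), (-1.2, 2, "h6")]
print(allocate_beam(cands, beam_size=4))
```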

Sockeye: A Toolkit for Neural Machine Translation

15 code implementations • 15 Dec 2017 • Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton, Matt Post

Written in Python and built on MXNet, the toolkit offers scalable training and inference for the three most prominent encoder-decoder architectures: attentional recurrent neural networks, self-attentional transformers, and fully convolutional networks.

Machine Translation • Translation
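
Sockeye is driven from the command line; below is a rough sketch of invoking its training and translation entry points from Python. The `sockeye-train`/`sockeye-translate` console scripts and the flags shown follow the Sockeye documentation, but option names can differ between versions (check `--help` for your install), and all file paths are placeholders.

```python
import subprocess

# Train a model on a parallel corpus (paths are placeholders).
subprocess.run(
    [
        "sockeye-train",
        "--source", "corpus.train.de",
        "--target", "corpus.train.en",
        "--validation-source", "corpus.dev.de",
        "--validation-target", "corpus.dev.en",
        "--output", "model_dir",
    ],
    check=True,
)

# Translate a test set with the trained model.
subprocess.run(
    [
        "sockeye-translate",
        "--models", "model_dir",
        "--input", "test.de",
        "--output", "test.hyp.en",
    ],
    check=True,
)
```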

The taraXÜ corpus of human-annotated machine translations

no code implementations • LREC 2014 • Eleftherios Avramidis, Aljoscha Burchardt, Sabine Hunsicker, Maja Popović, Cindy Tscherwinka, David Vilar, Hans Uszkoreit

Human translators are the key to evaluating machine translation (MT) quality, and also to addressing the so-far unanswered question of when and how to use MT in professional translation workflows.

General Classification • Machine Translation • +1
