Search Results for author: M. Amin Farajian

Found 10 papers, 1 paper with code

Unbabel's Submission to the WMT2019 APE Shared Task: BERT-based Encoder-Decoder for Automatic Post-Editing

no code implementations · WS 2019 · António V. Lopes, M. Amin Farajian, Gonçalo M. Correia, Jonay Trenous, André F. T. Martins

Analogously to dual-encoder architectures, we develop a BERT-based encoder-decoder (BED) model in which a single pretrained BERT encoder receives both the source (src) and machine translation (tgt) strings.

Tasks: Automatic Post-Editing, Decoder (+2 more)
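The abstract above describes feeding both the source and the machine-translated strings to a single BERT encoder. A minimal sketch of how such a packed, BERT-style sentence-pair input could be built follows; the whitespace tokenizer and the name `pack_src_tgt` are illustrative assumptions, not details from the paper:

```python
def pack_src_tgt(src: str, tgt: str):
    """Pack the source and MT output into one BERT-style input sequence.

    Returns (tokens, segment_ids): segment 0 marks the src half and
    segment 1 the tgt half, mirroring BERT's sentence-pair format.
    Whitespace splitting stands in for a real subword tokenizer.
    """
    src_tokens = src.split()
    tgt_tokens = tgt.split()
    tokens = ["[CLS]"] + src_tokens + ["[SEP]"] + tgt_tokens + ["[SEP]"]
    segment_ids = [0] * (len(src_tokens) + 2) + [1] * (len(tgt_tokens) + 1)
    return tokens, segment_ids

tokens, segs = pack_src_tgt("the cat", "le chat")
# tokens: ['[CLS]', 'the', 'cat', '[SEP]', 'le', 'chat', '[SEP]']
# segs:   [0, 0, 0, 0, 1, 1, 1]
```

The segment ids let a single encoder distinguish the two strings while still attending across them, which is what sets this apart from a true dual-encoder setup.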

Neural vs. Phrase-Based Machine Translation in a Multi-Domain Scenario

no code implementations · EACL 2017 · M. Amin Farajian, Marco Turchi, Matteo Negri, Nicola Bertoldi, Marcello Federico

State-of-the-art neural machine translation (NMT) systems are generally trained on specific domains by carefully selecting the training sets and applying proper domain adaptation techniques.

Tasks: Domain Adaptation, Machine Translation (+2 more)

WAGS: A Beautiful English-Italian Benchmark Supporting Word Alignment Evaluation on Rare Words

no code implementations · LREC 2016 · Luisa Bentivogli, Mauro Cettolo, M. Amin Farajian, Marcello Federico

This paper presents WAGS (Word Alignment Gold Standard), a novel benchmark which allows extensive evaluation of WA tools on out-of-vocabulary (OOV) and rare words.

Tasks: Sentence, Word Alignment
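WAGS is built to evaluate word-alignment (WA) tools specifically on OOV and rare words. A short sketch of the kind of evaluation this enables follows, using the standard Alignment Error Rate (AER) of Och and Ney restricted to a rare-word subset; the `restrict` helper and all example data are assumptions for illustration, not contents of the benchmark:

```python
def aer(predicted, sure, possible):
    """Alignment Error Rate (lower is better).

    `sure` and `possible` are gold link sets, with sure links also
    counted as possible; links are (src_position, tgt_position) pairs.
    """
    a, s, p = set(predicted), set(sure), set(possible)
    if not a and not s:
        return 0.0
    return 1.0 - (len(a & s) + len(a & p)) / (len(a) + len(s))

def restrict(links, rare_src_positions):
    """Keep only links whose source word is in the rare/OOV set."""
    return {(i, j) for (i, j) in links if i in rare_src_positions}

# Toy example: two gold sure links, one extra possible link.
sure = {(0, 0), (1, 1)}
possible = sure | {(1, 2)}
predicted = {(0, 0), (1, 2)}
rare = {1}  # suppose source position 1 holds a rare/OOV word

overall = aer(predicted, sure, possible)                       # 0.25
rare_only = aer(restrict(predicted, rare),
                restrict(sure, rare),
                restrict(possible, rare))                      # 0.5
```

Scoring the rare-word subset separately, as in the last line, is what a benchmark like WAGS makes possible: an aligner can look strong overall while degrading sharply on OOV items.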
