Search Results for author: Julian Schamper

Found 8 papers, 1 paper with code

Generalizing Back-Translation in Neural Machine Translation

no code implementations · WS 2019 · Miguel Graça, Yunsu Kim, Julian Schamper, Shahram Khadivi, Hermann Ney

Back-translation, i.e. data augmentation by translating target-language monolingual data back into the source language, is a crucial component of modern neural machine translation (NMT).

Data Augmentation · Machine Translation +3
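The back-translation idea summarized above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the `translate` function below is a hypothetical stand-in for a trained target→source NMT model (here it merely reverses word order so the sketch stays runnable).

```python
# Minimal sketch of back-translation as data augmentation for NMT.
# Assumption: `translate` stands in for a trained target->source model;
# it is NOT part of the paper, just a runnable placeholder.

def translate(sentence: str) -> str:
    # Placeholder for a trained target->source NMT model.
    return " ".join(reversed(sentence.split()))

def back_translate(monolingual_target, parallel_data):
    """Augment real parallel data with synthetic pairs whose source
    side is machine-translated from target-language monolingual text."""
    synthetic = [(translate(t), t) for t in monolingual_target]
    return parallel_data + synthetic

real = [("ein Haus", "a house")]          # genuine parallel pair
mono = ["a small house", "a big house"]   # target-side monolingual data
augmented = back_translate(mono, real)
```

Note that each synthetic pair keeps the genuine target sentence; only the source side is model-generated, which is why back-translation tends to preserve target-side fluency.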

Unsupervised Training for Large Vocabulary Translation Using Sparse Lexicon and Word Classes

no code implementations · EACL 2017 · Yunsu Kim, Julian Schamper, Hermann Ney

We address for the first time unsupervised training for a translation task with hundreds of thousands of vocabulary words.


The RWTH Aachen University English-German and German-English Unsupervised Neural Machine Translation Systems for WMT 2018

no code implementations · WS 2018 · Miguel Graça, Yunsu Kim, Julian Schamper, Jiahui Geng, Hermann Ney

This paper describes the unsupervised neural machine translation (NMT) systems of RWTH Aachen University developed for the English ↔ German news translation task of the EMNLP 2018 Third Conference on Machine Translation (WMT 2018).

Machine Translation · NMT +2

The RWTH Aachen University Supervised Machine Translation Systems for WMT 2018

1 code implementation · WS 2018 · Julian Schamper, Jan Rosendahl, Parnia Bahar, Yunsu Kim, Arne Nix, Hermann Ney

In total we improve by 6.8% BLEU over our last year's submission and by 4.8% BLEU over the winning system of the 2017 German→English task.

Machine Translation · Translation
