Search Results for author: Joern Wuebker

Found 26 papers, 3 papers with code

Automatic Bilingual Markup Transfer

2 code implementations · Findings (EMNLP) 2021 · Thomas Zenkel, Joern Wuebker, John DeNero

We describe the task of bilingual markup transfer, which involves placing markup tags from a source sentence into a fixed target translation.

Machine Translation · Translation
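The markup-transfer setting described above can be illustrated with a toy sketch: given a word alignment between source and target, a tagged source span is projected to the smallest target span covering its aligned words. The function and data below are hypothetical illustrations, not the paper's method.

```python
# Hypothetical sketch of markup transfer via a word alignment.
# Alignment pairs are (source_index, target_index); spans are
# half-open (start, end) token ranges.

def transfer_span(src_span, alignment):
    """Map a tagged source token span to the smallest target span
    covering all target positions aligned to it."""
    targets = [t for s, t in alignment if src_span[0] <= s < src_span[1]]
    if not targets:
        return None  # nothing aligned: tag placement is undefined here
    return (min(targets), max(targets) + 1)

# Source tokens 1..2 carry a markup tag; project them into the target.
alignment = [(0, 0), (1, 2), (2, 1), (3, 3)]
print(transfer_span((1, 3), alignment))  # -> (1, 3)
```

Real systems must additionally handle unaligned tokens and overlapping or nested tags, which this sketch ignores.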

Automatic Correction of Human Translations

1 code implementation · NAACL 2022 · Jessy Lin, Geza Kovacs, Aditya Shastry, Joern Wuebker, John DeNero

We show that human errors in TEC are more diverse and include far fewer translation fluency errors than the MT errors in automatic post-editing datasets, suggesting the need for dedicated TEC models specialized in correcting human errors.

Automatic Post-Editing · Translation

The Impact of Text Presentation on Translator Performance

no code implementations · 11 Nov 2020 · Samuel Läubli, Patrick Simianer, Joern Wuebker, Geza Kovacs, Rico Sennrich, Spence Green

Widely used computer-aided translation (CAT) tools divide documents into segments such as sentences and arrange them in a side-by-side, spreadsheet-like view.


End-to-End Neural Word Alignment Outperforms GIZA++

no code implementations · ACL 2020 · Thomas Zenkel, Joern Wuebker, John DeNero

Although unnecessary for training neural MT models, word alignment still plays an important role in interactive applications of neural machine translation, such as annotation transfer and lexicon injection.

Machine Translation · Translation · +1

Measuring Immediate Adaptation Performance for Neural Machine Translation

no code implementations · NAACL 2019 · Patrick Simianer, Joern Wuebker, John DeNero

Incremental domain adaptation, in which a system learns from the correct output for each input immediately after making its prediction for that input, can dramatically improve system performance for interactive machine translation.

Domain Adaptation · Machine Translation · +2
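The incremental-adaptation protocol in the abstract above — predict first, then immediately learn from the revealed correct output — can be sketched with a toy online learner. A one-parameter least-squares model stands in for an NMT system here; everything below is illustrative, not the paper's setup.

```python
# Toy illustration of immediate adaptation: for each input, the model
# predicts *before* seeing the reference, then takes one gradient step
# on the revealed correct output before the next input arrives.

def adapt_stream(stream, lr=0.1):
    w = 0.0          # single model parameter (stand-in for NMT weights)
    errors = []
    for x, y in stream:
        pred = w * x               # 1) predict for the current input
        errors.append(abs(pred - y))
        w += lr * (y - pred) * x   # 2) update on the correct output
    return w, errors

# A stream of 20 segments from one "domain" where the true mapping is y = 2x.
stream = [(1.0, 2.0)] * 20
w, errors = adapt_stream(stream)
print(errors[0], errors[-1])  # later predictions improve on earlier ones
```

The point of the measurement problem in the paper is visible even in this toy: performance must be scored on each prediction made *before* its update, not after.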

Adding Interpretable Attention to Neural Translation Models Improves Word Alignment

1 code implementation · 31 Jan 2019 · Thomas Zenkel, Joern Wuebker, John DeNero

Multi-layer models with multiple attention heads per layer provide superior translation quality compared to simpler and shallower models, but determining what source context is most relevant to each target word is more challenging as a result.

Machine Translation · Translation · +1
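The difficulty described above — deciding which source word is most relevant to each target word — is often approached by reading an alignment off an attention matrix. A minimal sketch, with a made-up attention matrix and simple per-row argmax linking (not the interpretable-attention layer the paper proposes):

```python
# Sketch: extract a word alignment from an attention matrix by linking
# each target word to its highest-scoring source word. Rows are target
# positions, columns are source positions; the scores are invented.

attention = [
    [0.7, 0.2, 0.1],   # target word 0 attends mostly to source word 0
    [0.1, 0.1, 0.8],   # target word 1 -> source word 2
    [0.2, 0.6, 0.2],   # target word 2 -> source word 1
]

# Alignment as (source_index, target_index) pairs via row-wise argmax.
alignment = [(row.index(max(row)), t) for t, row in enumerate(attention)]
print(alignment)  # -> [(0, 0), (2, 1), (1, 2)]
```

With multiple layers and heads there is no single such matrix to read from, which is the motivation for adding a dedicated, interpretable attention layer.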

A Comparative Study on Vocabulary Reduction for Phrase Table Smoothing

no code implementations · WS 2016 · Yunsu Kim, Andreas Guta, Joern Wuebker, Hermann Ney

This work systematically analyzes the smoothing effect of vocabulary reduction for phrase translation models.


Compact Personalized Models for Neural Machine Translation

no code implementations · EMNLP 2018 · Joern Wuebker, Patrick Simianer, John DeNero

We propose and compare methods for gradient-based domain adaptation of self-attentive neural machine translation models.

Domain Adaptation · Machine Translation · +1
