EAMT 2020 • Maciej Modrzejewski, Miriam Exel, Bianka Buschbeck, Thanh-Le Ha, Alexander Waibel
The correct translation of named entities (NEs) still poses a challenge for conventional neural machine translation (NMT) systems.
EAMT 2020 • Miriam Exel, Bianka Buschbeck, Lauritz Brandt, Simona Doneva
This paper examines approaches to bias a neural machine translation model to adhere to terminology constraints in an industrial setup.
EAMT 2022 • Bianka Buschbeck, Jennifer Mell, Miriam Exel, Matthias Huck
This paper addresses the automatic translation of conversational content in a business context, for example customer support chat dialogues.
23 Oct 2023 • Sai Koneru, Miriam Exel, Matthias Huck, Jan Niehues
Building on the LLM's ability to process and generate lengthy sequences, we also propose extending our approach to document-level translation.
17 Jul 2023 • Nathaniel Berger, Miriam Exel, Matthias Huck, Stefan Riezler
Supervised learning in Neural Machine Translation (NMT) typically follows a teacher forcing paradigm, where reference tokens, rather than the model's own previous predictions, constitute the conditioning context for each prediction.
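The distinction between teacher forcing and conditioning on the model's own outputs can be sketched with a toy example. The lookup-table "model" below is purely hypothetical, standing in for a real NMT decoder; it contains a deliberate error so the two decoding regimes diverge.

```python
# Toy next-token "model": maps a context tuple to a predicted next token.
# Hypothetical stand-in for a real NMT decoder; includes one wrong entry.
TOY_MODEL = {
    ("<s>",): "the",
    ("<s>", "the"): "dog",        # model error: reference continues with "cat"
    ("<s>", "the", "cat"): "sat",
}

def predict(context):
    # Unseen contexts fall back to "<unk>", mimicking off-distribution inputs.
    return TOY_MODEL.get(tuple(context), "<unk>")

def teacher_forced(reference):
    # Training-time setup: each step is conditioned on the *reference* prefix,
    # so an earlier model error does not affect later predictions.
    return [predict(["<s>"] + reference[:i]) for i in range(len(reference))]

def free_running(n_steps):
    # Inference-time setup: each step is conditioned on the model's *own*
    # previous outputs, so an early error can push later steps off-distribution.
    context = ["<s>"]
    for _ in range(n_steps):
        context.append(predict(context))
    return context[1:]

print(teacher_forced(["the", "cat", "sat"]))  # ['the', 'dog', 'sat']
print(free_running(3))                        # ['the', 'dog', '<unk>']
```

Under teacher forcing the step-2 error is "repaired" by conditioning on the reference prefix, while the free-running decoder compounds it; this train/inference mismatch is commonly called exposure bias.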
AACL (WAT) 2020 • Bianka Buschbeck, Miriam Exel
This paper accompanies the software documentation data set for machine translation, a parallel evaluation data set originating from the SAP Help Portal, which we released to the machine translation community for research purposes.