1 code implementation • EMNLP 2021 • Raksha Shenoy, Nico Herbig, Antonio Krüger, Josef van Genabith
For helpful quality levels, a visualization reflecting the uncertainty of the QE model is preferred.
1 code implementation • ACL 2021 • Rashad Albo Jamara, Nico Herbig, Antonio Krüger, Josef van Genabith
Here, we present the first study that investigates the usefulness of mid-air hand gestures in combination with the keyboard (GK) for text editing in PE of MT.
no code implementations • ACL 2020 • Nico Herbig, Tim Düwel, Santanu Pal, Kalliopi Meladaki, Mahsa Monshizadeh, Antonio Krüger, Josef van Genabith
On the other hand, speech and multi-modal combinations of select & speech are considered suitable for replacements and insertions but offer less potential for deletion and reordering.
no code implementations • ACL 2020 • Nico Herbig, Santanu Pal, Tim Düwel, Kalliopi Meladaki, Mahsa Monshizadeh, Vladislav Hnatovskiy, Antonio Krüger, Josef van Genabith
The shift from traditional translation to post-editing (PE) of machine-translated (MT) text can save time and reduce errors, but it also affects the design of translation interfaces, as the task changes from mainly generating text to correcting errors within otherwise helpful translation proposals.
no code implementations • COLING 2020 • Santanu Pal, Hongfei Xu, Nico Herbig, Sudip Kumar Naskar, Antonio Krueger, Josef van Genabith
In automatic post-editing (APE) it makes sense to condition post-editing (pe) decisions on both the source (src) and the machine translated text (mt) as input.
no code implementations • WS 2019 • Santanu Pal, Hongfei Xu, Nico Herbig, Antonio Krüger, Josef van Genabith
In this paper we present an English–German Automatic Post-Editing (APE) system called transference, submitted to the APE Task organized at WMT 2019.
no code implementations • 7 Mar 2019 • Nico Herbig, Santanu Pal, Josef van Genabith, Antonio Krüger
Current advances in machine translation increase the need for translators to switch from traditional translation to post-editing of machine-translated text, a process that saves time and improves quality.
no code implementations • WS 2018 • Santanu Pal, Nico Herbig, Antonio Krüger, Josef van Genabith
The proposed model is an extension of the transformer architecture: two separate self-attention-based encoders encode the machine translation output (mt) and the source (src), followed by a joint encoder that attends over a combination of these two encoded sequences (enc_src and enc_mt) for generating the post-edited sentence.
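The dual-encoder design described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: single attention calls stand in for full transformer encoder stacks, and the function names (`encode`, `transference_step`) are hypothetical. It only shows the data flow — src and mt are encoded separately, then a joint step attends from the mt encoding over both encoded sequences.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention (single head, no learned projections)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def encode(x):
    # one self-attention layer with a residual connection,
    # standing in for a full transformer encoder stack
    return x + attention(x, x, x)

def transference_step(src, mt):
    # hypothetical sketch of the transference data flow:
    # encode src and mt separately, then let a joint encoder
    # attend from enc_mt over the combined encoded sequences
    enc_src = encode(src)          # (len_src, d)
    enc_mt = encode(mt)            # (len_mt, d)
    joint = np.concatenate([enc_src, enc_mt], axis=0)
    return enc_mt + attention(enc_mt, joint, joint)  # (len_mt, d)
```

The joint attention over the concatenation of enc_src and enc_mt is what lets post-editing decisions condition on both the source and the machine translation at once; in the real model this feeds a decoder that generates the post-edited sentence.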