Search Results for author: Tobias Domhan

Found 12 papers, 11 with code

Sockeye: A Toolkit for Neural Machine Translation

16 code implementations · 15 Dec 2017 · Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton, Matt Post

Written in Python and built on MXNet, the toolkit offers scalable training and inference for the three most prominent encoder-decoder architectures: attentional recurrent neural networks, self-attentional transformers, and fully convolutional networks.

Machine Translation · NMT · +1
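All three architecture families above fit the same encoder-decoder interface; the self-attentional transformer component can be illustrated with a minimal sketch of scaled dot-product self-attention. This is an illustrative NumPy toy, not Sockeye's actual MXNet implementation, and it omits the learned query/key/value projections for brevity:

```python
import numpy as np

def self_attention(x):
    """Scaled dot-product self-attention over one sequence.

    x: array of shape (seq_len, d_model). Queries, keys and values are
    all the raw input here (no learned projections), so each output row
    is a similarity-weighted average of all input positions.
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                    # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ x                               # same shape as x

x = np.random.default_rng(0).normal(size=(5, 8))
out = self_attention(x)
print(out.shape)  # (5, 8)
```

Unlike the recurrent variant, every position attends to every other position in one step, which is what makes training parallelizable across the sequence.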

Image Captioning as Neural Machine Translation Task in SOCKEYE

1 code implementation · 9 Oct 2018 · Loris Bazzani, Tobias Domhan, Felix Hieber

Image captioning is an interdisciplinary research problem that stands between computer vision and natural language processing.

Image Captioning · Machine Translation · +2

How Much Attention Do You Need? A Granular Analysis of Neural Machine Translation Architectures

1 code implementation · ACL 2018 · Tobias Domhan

With recent advances in network architectures for Neural Machine Translation (NMT), recurrent models have effectively been replaced by either convolutional or self-attentional approaches, such as the Transformer.

Machine Translation · NMT · +1

Using Target-side Monolingual Data for Neural Machine Translation through Multi-task Learning

1 code implementation · EMNLP 2017 · Tobias Domhan, Felix Hieber

The performance of Neural Machine Translation (NMT) models relies heavily on the availability of sufficient amounts of parallel data, and an efficient and effective way of leveraging the vast amounts of available monolingual data has yet to be found.

Language Modelling · Machine Translation · +3
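One way to realize this kind of multi-task setup is to interleave translation batches (parallel data) with language-modelling batches (target-side monolingual data), so a shared decoder is trained on both objectives. The scheduler below is a hypothetical sketch of that idea, not the paper's exact training procedure:

```python
from itertools import cycle

def interleave_tasks(parallel_batches, mono_batches):
    """Build a 1:1 training schedule alternating two tasks.

    Each translation batch is followed by a language-modelling batch;
    the (typically smaller) monolingual stream is cycled so it never
    runs out before the parallel data does.
    """
    schedule = []
    for p, m in zip(parallel_batches, cycle(mono_batches)):
        schedule.append(("translate", p))  # supervised seq2seq loss
        schedule.append(("lm", m))         # decoder-only LM loss
    return schedule

schedule = interleave_tasks(["p1", "p2", "p3"], ["m1", "m2"])
print(schedule)
```

In practice the mixing ratio between the two tasks is a tunable hyperparameter rather than the fixed 1:1 alternation used here.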

Sockeye 2: A Toolkit for Neural Machine Translation

1 code implementation · EAMT 2020 · Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar

We present Sockeye 2, a modernized and streamlined version of the Sockeye neural machine translation (NMT) toolkit.

Machine Translation · NMT · +1

The Devil is in the Details: On the Pitfalls of Vocabulary Selection in Neural Machine Translation

1 code implementation · NAACL 2022 · Tobias Domhan, Eva Hasler, Ke Tran, Sony Trenous, Bill Byrne, Felix Hieber

Vocabulary selection, or lexical shortlisting, is a well-known technique to improve latency of Neural Machine Translation models by constraining the set of allowed output words during inference.

Machine Translation · Sentence · +1
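The core mechanic of lexical shortlisting is simple: for each source sentence, restrict the output softmax to the union of the top-k target candidates of each source token under a translation lexicon. A minimal sketch, with a hypothetical toy lexicon (the paper studies pitfalls of this technique rather than prescribing this exact procedure):

```python
def shortlist(source_tokens, lexicon, top_k=2,
              always_include=("<eos>", "<unk>")):
    """Build the allowed target vocabulary for one source sentence.

    lexicon maps each source token to target candidates ranked by
    translation probability; decoding is then constrained to the union
    of the per-token top-k candidates plus a few special symbols.
    """
    allowed = set(always_include)
    for tok in source_tokens:
        allowed.update(lexicon.get(tok, [])[:top_k])
    return allowed

# Toy German->English lexicon, invented for illustration.
lexicon = {"katze": ["cat", "kitty", "feline"],
           "schwarze": ["black", "dark"]}
print(sorted(shortlist(["schwarze", "katze"], lexicon)))
```

Shrinking the output layer this way cuts softmax cost and thus latency, but, as the paper's title suggests, an overly aggressive shortlist can exclude the correct target word.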

A Shocking Amount of the Web is Machine Translated: Insights from Multi-Way Parallelism

1 code implementation · 11 Jan 2024 · Brian Thompson, Mehak Preet Dhaliwal, Peter Frisch, Tobias Domhan, Marcello Federico

We show that content on the web is often translated into many languages, and the low quality of these multi-way translations indicates they were likely created using Machine Translation (MT).

Machine Translation · Selection bias

Trained MT Metrics Learn to Cope with Machine-translated References

1 code implementation · 1 Dec 2023 · Jannis Vamvas, Tobias Domhan, Sony Trenous, Rico Sennrich, Eva Hasler

Neural metrics trained on human evaluations of MT tend to correlate well with human judgments, but their behavior is not fully understood.
