Search Results for author: Stephan Peitz

Found 17 papers, 1 paper with code

State Spaces Aren't Enough: Machine Translation Needs Attention

no code implementations · 25 Apr 2023 · Ali Vardasbi, Telmo Pessoa Pires, Robin M. Schmidt, Stephan Peitz

Structured State Spaces for Sequences (S4) is a recently proposed sequence model with successful applications in various tasks, e.g. vision, language modeling, and audio.

Tasks: Language Modelling · Machine Translation · +2
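
For context on the paper above: S4-style layers are built on a discrete linear state-space recurrence. The sketch below shows that recurrence with toy, unstructured matrices; it is only an illustrative assumption about the building block, not the paper's model (the actual S4 layer uses a structured HiPPO-based parameterization and a convolutional reformulation for efficiency).

import numpy as np

# Minimal sketch of a discrete linear state-space recurrence, the building
# block behind S4-style layers. A, B, C are toy random matrices here.
state_dim, input_dim = 4, 1
rng = np.random.default_rng(0)
A = rng.normal(size=(state_dim, state_dim)) * 0.1
B = rng.normal(size=(state_dim, input_dim))
C = rng.normal(size=(input_dim, state_dim))

def ssm_scan(u):
    # Run x_k = A x_{k-1} + B u_k, y_k = C x_k over an input sequence u.
    x = np.zeros((state_dim, 1))
    ys = []
    for u_k in u:
        x = A @ x + B @ u_k.reshape(input_dim, 1)
        ys.append((C @ x).ravel())
    return np.stack(ys)

y = ssm_scan(rng.normal(size=(10, input_dim)))
print(y.shape)  # (10, 1): one output per input step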

Non-Autoregressive Neural Machine Translation: A Call for Clarity

no code implementations · 21 May 2022 · Robin M. Schmidt, Telmo Pires, Stephan Peitz, Jonas Lööf

Non-autoregressive approaches aim to improve the inference speed of translation models by only requiring a single forward pass to generate the output sequence instead of iteratively producing each predicted token.

Tasks: Machine Translation · Translation
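
As a rough illustration of the distinction drawn in the abstract above, the sketch below contrasts token-by-token autoregressive decoding with a single-pass non-autoregressive decoder. The functions decode_step and decode_all are hypothetical stand-ins for a trained translation model, not code from the paper.

# Schematic contrast between autoregressive and non-autoregressive decoding.

def autoregressive_decode(decode_step, src, max_len, bos=0, eos=1):
    # One forward pass per generated token: each prediction is conditioned
    # on all previously generated tokens.
    tokens = [bos]
    for _ in range(max_len):
        next_token = decode_step(src, tokens)
        tokens.append(next_token)
        if next_token == eos:
            break
    return tokens[1:]

def non_autoregressive_decode(decode_all, src, tgt_len):
    # A single forward pass predicts every target position in parallel,
    # given a predicted (or fixed) target length.
    return decode_all(src, tgt_len)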

Jointly Learning to Align and Translate with Transformer Models

1 code implementation IJCNLP 2019 Sarthak Garg, Stephan Peitz, Udhyakumar Nallasamy, Matthias Paulik

The state of the art in machine translation (MT) is governed by neural approaches, which typically provide superior translation accuracy over statistical approaches.

Tasks: Machine Translation · Translation · +1

Local System Voting Feature for Machine Translation System Combination

no code implementations WS 2015 Markus Freitag, Jan-Thorsten Peter, Stephan Peitz, Minwei Feng, Hermann Ney

In this paper, we enhance the traditional confusion network system combination approach with an additional model trained by a neural network.

Tasks: Machine Translation · Sentence · +1
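
For readers unfamiliar with confusion network system combination (the baseline enhanced in the paper above), the sketch below shows plain per-slot majority voting over aligned system outputs. It is an illustrative assumption about that baseline setup, not the neural local-system-voting feature the paper proposes.

from collections import Counter

# Loose sketch of confusion-network-style voting: each "slot" holds the
# candidate words the individual MT systems produced at one aligned
# position, and the combined output takes the majority word per slot.

def combine(aligned_hypotheses):
    slots = zip(*aligned_hypotheses)  # one slot per target position
    return [Counter(slot).most_common(1)[0][0] for slot in slots]

hyps = [
    ["the", "house", "is", "small"],
    ["the", "home",  "is", "small"],
    ["a",   "house", "is", "tiny"],
]
print(combine(hyps))  # ['the', 'house', 'is', 'small']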
