Search Results for author: Parnia Bahar

Found 17 papers, 3 papers with code

The RWTH Aachen Machine Translation Systems for IWSLT 2017

no code implementations IWSLT 2017 Parnia Bahar, Jan Rosendahl, Nick Rossenbach, Hermann Ney

This work describes the Neural Machine Translation (NMT) system of RWTH Aachen University developed for the English-German tracks of the evaluation campaign of the International Workshop on Spoken Language Translation (IWSLT) 2017.

Domain Adaptation, Machine Translation, +2

Automatic Video Dubbing at AppTek

no code implementations EAMT 2022 Mattia Di Gangi, Nick Rossenbach, Alejandro Pérez, Parnia Bahar, Eugen Beck, Patrick Wilken, Evgeny Matusov

The revoicing usually comes with a changed script, mostly in a different language, and it should reproduce the original emotions, be coherent with the body language, and be lip-synchronized.

Take the Hint: Improving Arabic Diacritization with Partially-Diacritized Text

1 code implementation, 6 Jun 2023: Parnia Bahar, Mattia Di Gangi, Nick Rossenbach, Mohammad Zeineldeen

Automatic Arabic diacritization is useful in many applications, ranging from reading support for language learners to accurate pronunciation prediction for downstream tasks like speech synthesis.

Speech Synthesis

Tight Integrated End-to-End Training for Cascaded Speech Translation

no code implementations, 24 Nov 2020: Parnia Bahar, Tobias Bieschke, Ralf Schlüter, Hermann Ney

Direct speech translation is an alternative method to avoid error propagation; however, its performance often lags behind that of the cascade system.


Two-Way Neural Machine Translation: A Proof of Concept for Bidirectional Translation Modeling using a Two-Dimensional Grid

no code implementations, 24 Nov 2020: Parnia Bahar, Christopher Brix, Hermann Ney

Neural translation models have proven to be effective in capturing sufficient information from a source sentence and generating a high-quality target sentence.

Machine Translation, Sentence, +2

Successfully Applying the Stabilized Lottery Ticket Hypothesis to the Transformer Architecture

no code implementations ACL 2020 Christopher Brix, Parnia Bahar, Hermann Ney

Sparse models require less memory for storage and enable a faster inference by reducing the necessary number of FLOPs.

A Comparative Study on End-to-end Speech to Text Translation

no code implementations, 20 Nov 2019: Parnia Bahar, Tobias Bieschke, Hermann Ney

Recent advances in deep learning show that end-to-end speech-to-text translation models are a promising approach to direct speech translation.

Speech-to-Text Translation, Translation

The RWTH Aachen University Machine Translation Systems for WMT 2019

no code implementations WS 2019 Jan Rosendahl, Christian Herold, Yunsu Kim, Miguel Graça, Weiyue Wang, Parnia Bahar, Yingbo Gao, Hermann Ney

For the De-En task, none of the tested methods gave a significant improvement over last year's winning system and we end up with the same performance, resulting in 39.6% BLEU on newstest2019.

Attribute, Language Modelling, +3

Towards Two-Dimensional Sequence to Sequence Model in Neural Machine Translation

1 code implementation EMNLP 2018 Parnia Bahar, Christopher Brix, Hermann Ney

This work investigates an alternative model for neural machine translation (NMT) and proposes a novel architecture, where we employ a multi-dimensional long short-term memory (MDLSTM) for translation modeling.

Machine Translation, NMT, +2

The RWTH Aachen University Supervised Machine Translation Systems for WMT 2018

1 code implementation WS 2018 Julian Schamper, Jan Rosendahl, Parnia Bahar, Yunsu Kim, Arne Nix, Hermann Ney

In total we improve by 6.8% BLEU over our last year's submission and by 4.8% BLEU over the winning system of the 2017 German→English task.

Machine Translation, Translation
