no code implementations • 18 Oct 2023 • Frithjof Petrick, Christian Herold, Pavel Petrushkov, Shahram Khadivi, Hermann Ney
Finally, we explore language model fusion in the light of recent advancements in large language models.
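Language model fusion is commonly realized as shallow fusion, a log-linear combination of the translation model's and an external language model's token scores during decoding. The sketch below is a minimal illustration of that idea; the tensor shapes, the fusion weight `lam`, and the model interfaces are assumptions, not the paper's exact formulation.

```python
import torch

def shallow_fusion_step(nmt_logprobs: torch.Tensor,
                        lm_logprobs: torch.Tensor,
                        lam: float = 0.3) -> torch.Tensor:
    """Combine per-token log-probabilities of an NMT model and an external
    language model via log-linear interpolation (shallow fusion)."""
    # Both tensors have shape (vocab_size,); lam weights the LM contribution.
    return nmt_logprobs + lam * lm_logprobs

# During beam search, the fused scores would replace the plain NMT scores
# when ranking candidate next tokens.
```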
no code implementations • 27 Sep 2021 • Evgeniia Tokarchuk, Jan Rosendahl, Weiyue Wang, Pavel Petrushkov, Tomer Lancewicki, Shahram Khadivi, Hermann Ney
Pivot-based neural machine translation (NMT) is commonly used in low-resource setups, especially for translation between non-English language pairs.
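In its simplest cascaded form, pivot-based translation chains two independently trained systems: source-to-pivot, then pivot-to-target. The following sketch only illustrates that cascade; the `translate` method and model objects are hypothetical placeholders, not the authors' actual API.

```python
def pivot_translate(src_sentence: str,
                    src_to_pivot_model,
                    pivot_to_tgt_model) -> str:
    """Cascaded pivot translation: source -> pivot -> target.

    Both models are assumed to expose a `translate(text) -> str` method;
    this interface is illustrative only.
    """
    pivot_sentence = src_to_pivot_model.translate(src_sentence)   # e.g. De -> En
    return pivot_to_tgt_model.translate(pivot_sentence)           # e.g. En -> Fr
```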
no code implementations • ACL (IWSLT) 2021 • Evgeniia Tokarchuk, Jan Rosendahl, Weiyue Wang, Pavel Petrushkov, Tomer Lancewicki, Shahram Khadivi, Hermann Ney
Complex natural language applications such as speech translation or pivot translation traditionally rely on cascaded models.
no code implementations • IJCNLP 2019 • Yunsu Kim, Petre Petrov, Pavel Petrushkov, Shahram Khadivi, Hermann Ney
We present effective pre-training strategies for neural machine translation (NMT) using parallel corpora involving a pivot language, i.e., source-pivot and pivot-target, leading to a significant improvement in source-target translation.
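One common way to exploit source-pivot and pivot-target data is to transfer parameters into the source-target model before fine-tuning, e.g., taking the encoder from a source-pivot system and the decoder from a pivot-target system. The sketch below assumes PyTorch checkpoints and `encoder`/`decoder` attribute names; these are illustrative assumptions, and the paper's exact transfer scheme may differ.

```python
import torch

def init_source_target_model(src_tgt_model,
                             src_pivot_ckpt: str,
                             pivot_tgt_ckpt: str):
    """Initialize a source-target NMT model from pivot-based pre-training:
    encoder from a source-pivot model, decoder from a pivot-target model."""
    src_pivot_state = torch.load(src_pivot_ckpt, map_location="cpu")
    pivot_tgt_state = torch.load(pivot_tgt_ckpt, map_location="cpu")

    # Copy encoder weights from the source-pivot model.
    src_tgt_model.encoder.load_state_dict(
        {k.replace("encoder.", "", 1): v
         for k, v in src_pivot_state.items() if k.startswith("encoder.")})
    # Copy decoder weights from the pivot-target model.
    src_tgt_model.decoder.load_state_dict(
        {k.replace("decoder.", "", 1): v
         for k, v in pivot_tgt_state.items() if k.startswith("decoder.")})
    return src_tgt_model
```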
no code implementations • IWSLT (EMNLP) 2018 • Shen Yan, Leonard Dahlmann, Pavel Petrushkov, Sanjika Hewavitharana, Shahram Khadivi
Pre-training a model with word weights improves fine-tuning by up to 1.24% BLEU absolute and 1.64% TER, respectively.
no code implementations • ACL 2018 • Pavel Petrushkov, Shahram Khadivi, Evgeny Matusov
We empirically investigate learning from partial feedback in neural machine translation (NMT), when partial feedback is collected by asking users to highlight a correct chunk of a translation.
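A straightforward way to use such chunk-level feedback is to weight the per-token training loss so that only tokens inside a user-highlighted (correct) chunk contribute to the update. The sketch below shows that weighting idea under those assumptions; it is not the paper's exact objective.

```python
import torch
import torch.nn.functional as F

def chunk_feedback_loss(logits: torch.Tensor,
                        target_ids: torch.Tensor,
                        chunk_mask: torch.Tensor) -> torch.Tensor:
    """Token-level loss weighted by partial feedback.

    `chunk_mask` is 1 for tokens inside a user-highlighted (correct) chunk
    and 0 elsewhere, so only highlighted tokens drive the gradient.
    Illustrative sketch only.
    """
    # logits: (seq_len, vocab); target_ids, chunk_mask: (seq_len,)
    token_nll = F.cross_entropy(logits, target_ids, reduction="none")
    return (token_nll * chunk_mask).sum() / chunk_mask.sum().clamp(min=1)
```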
no code implementations • EMNLP 2017 • Leonard Dahlmann, Evgeny Matusov, Pavel Petrushkov, Shahram Khadivi
In this paper, we introduce a hybrid search for attention-based neural machine translation (NMT).