Search Results for author: Dario Stojanovski

Found 14 papers, 3 papers with code

Improving Machine Translation of Rare and Unseen Word Senses

no code implementations WMT (EMNLP) 2021 Viktor Hangya, Qianchu Liu, Dario Stojanovski, Alexander Fraser, Anna Korhonen

The performance of NMT systems has improved drastically in the past few years but the translation of multi-sense words still poses a challenge.

Bilingual Lexicon Induction NMT +3

Language-Family Adapters for Low-Resource Multilingual Neural Machine Translation

no code implementations 30 Sep 2022 Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser

Training a new adapter on each language pair or training a single adapter on all language pairs without updating the pretrained model has been proposed as a parameter-efficient alternative.

Cross-Lingual Transfer Machine Translation +1
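The adapter approach described above can be illustrated with a minimal bottleneck adapter: a down-projection, a nonlinearity, an up-projection, and a residual connection, trained per language family while the pretrained model stays frozen. This is an illustrative sketch, not the paper's implementation; the class name, dimensions, and NumPy stand-in for a real framework are all assumptions.

```python
import numpy as np

class BottleneckAdapter:
    """Minimal bottleneck adapter sketch (hypothetical, NumPy-only):
    down-project, ReLU, up-project, residual. In a language-family
    setup, one such module per family would be trained while the
    pretrained Transformer weights stay frozen."""

    def __init__(self, hidden_dim=512, bottleneck_dim=64, seed=0):
        rng = np.random.default_rng(seed)
        # Small init keeps the adapter close to identity at the start.
        self.W_down = rng.normal(0.0, 0.02, (hidden_dim, bottleneck_dim))
        self.W_up = rng.normal(0.0, 0.02, (bottleneck_dim, hidden_dim))

    def __call__(self, x):
        h = np.maximum(x @ self.W_down, 0.0)  # down-project + ReLU
        return x + h @ self.W_up              # up-project + residual

# toy usage: a batch of 2 hidden states of width 8
x = np.ones((2, 8))
adapter = BottleneckAdapter(hidden_dim=8, bottleneck_dim=2)
out = adapter(x)
```

The residual plus small initialization means a freshly added adapter barely perturbs the frozen model, which is what makes per-family adapter training a cheap alternative to full fine-tuning.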

Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation

1 code implementation NAACL 2021 Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser

Successful methods for unsupervised neural machine translation (UNMT) employ crosslingual pretraining via self-supervision, often in the form of a masked language modeling or a sequence generation task, which requires the model to align the lexical- and high-level representations of the two languages.

Bilingual Lexicon Induction Language Modelling +2
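The masked language modeling objective mentioned above can be sketched in a few lines: tokens are randomly replaced by a mask symbol, and the model is trained to recover the originals. The function name, masking rate, and simplified scheme (always replacing with the mask token, unlike BERT's 80/10/10 split) are assumptions for illustration.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """Simplified MLM masking sketch: each token is independently
    selected with probability mask_prob and replaced by mask_token;
    targets record the original token at masked positions only."""
    rng = random.Random(seed)
    masked, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            targets.append(tok)   # model must predict this
        else:
            masked.append(tok)
            targets.append(None)  # position not predicted
    return masked, targets

masked, targets = mask_tokens("the cat sat on the mat".split(), seed=1)
```

Training the same encoder on masked text from both languages is one way such pretraining pushes lexical representations of the two languages toward a shared space.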

ContraCAT: Contrastive Coreference Analytical Templates for Machine Translation

no code implementations COLING 2020 Dario Stojanovski, Benno Krojer, Denis Peskov, Alexander Fraser

Recent high scores on pronoun translation using context-aware neural machine translation have suggested that current approaches work well.

Machine Translation NMT +1

The LMU Munich System for the WMT 2020 Unsupervised Machine Translation Shared Task

1 code implementation WMT (EMNLP) 2020 Alexandra Chronopoulou, Dario Stojanovski, Viktor Hangya, Alexander Fraser

Our core unsupervised neural machine translation (UNMT) system follows the strategy of Chronopoulou et al. (2020), using a monolingual pretrained language generation model (on German) and fine-tuning it on both German and Upper Sorbian, before initializing a UNMT model, which is trained with online backtranslation.

Text Generation Translation +1
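The online back-translation step in the pipeline above can be sketched as follows: monolingual target-side sentences are translated back into the source language with the current target-to-source model, and the resulting synthetic pairs update the source-to-target model. The function and callable names are hypothetical; the word-reversing "translator" is a toy stand-in for a real model.

```python
def online_backtranslation_step(mono_tgt, translate_t2s, train_s2t):
    """One round of online back-translation (sketch). mono_tgt is a
    list of monolingual target sentences; translate_t2s and train_s2t
    are hypothetical callables standing in for real model code."""
    synthetic_src = [translate_t2s(t) for t in mono_tgt]
    pairs = list(zip(synthetic_src, mono_tgt))  # (synthetic src, real tgt)
    train_s2t(pairs)                            # update src->tgt model
    return pairs

# toy stand-ins: "translation" reverses word order, "training" logs pairs
reverse = lambda s: " ".join(reversed(s.split()))
updates = []
pairs = online_backtranslation_step(
    ["guten morgen", "hallo welt"], reverse, updates.extend)
```

Because the target side of every synthetic pair is genuine text, the source-to-target model always learns to produce fluent output, even though its inputs are noisy model translations.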

Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT

1 code implementation EMNLP 2020 Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser

Using a language model (LM) pretrained on two languages with large monolingual data in order to initialize an unsupervised neural machine translation (UNMT) system yields state-of-the-art results.

Language Modelling Machine Translation +2

The LMU Munich Unsupervised Machine Translation System for WMT19

no code implementations WS 2019 Dario Stojanovski, Viktor Hangya, Matthias Huck, Alexander Fraser

We describe LMU Munich's machine translation system for German→Czech translation which was used to participate in the WMT19 shared task on unsupervised news translation.

Denoising Language Modelling +3

Combining Local and Document-Level Context: The LMU Munich Neural Machine Translation System at WMT19

no code implementations WS 2019 Dario Stojanovski, Alexander Fraser

We describe LMU Munich's machine translation system for English→German translation which was used to participate in the WMT19 shared task on supervised news translation.

Machine Translation Sentence +1

Coreference and Coherence in Neural Machine Translation: A Study Using Oracle Experiments

no code implementations WS 2018 Dario Stojanovski, Alexander Fraser

We show that NMT models taking advantage of context oracle signals can achieve considerable gains in BLEU, of up to 7.02 BLEU for coreference and 1.89 BLEU for coherence on subtitles translation.

Coreference Resolution Language Modelling +4

LMU Munich's Neural Machine Translation Systems at WMT 2018

no code implementations WS 2018 Matthias Huck, Dario Stojanovski, Viktor Hangya, Alexander Fraser

The systems were used for our participation in the WMT18 biomedical translation task and in the shared task on machine translation of news.

Domain Adaptation Translation +1
