no code implementations • WMT (EMNLP) 2020 • Rachel Bawden, Biao Zhang, Andre Tättar, Matt Post
We describe parBLEU, parCHRF++, and parESIM, which augment baseline metrics with automatically generated paraphrases produced by PRISM (Thompson and Post, 2020a), a multilingual neural machine translation system.
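The core idea behind these paraphrase-augmented metrics can be sketched as follows: score the hypothesis against the original reference plus automatically generated paraphrases of it, and keep the best score. This is a minimal illustration only; `metric` and `generate_paraphrases` below are hypothetical placeholders, not the actual parBLEU/PRISM implementations.

```python
# Sketch of paraphrase-augmented metric scoring (the idea behind
# parBLEU-style metrics). Both helpers are illustrative stand-ins:
# a real system would use BLEU/chrF++/ESIM and a neural paraphraser.

def metric(hypothesis: str, reference: str) -> float:
    # Placeholder similarity: token-set overlap (Jaccard) ratio.
    h, r = set(hypothesis.split()), set(reference.split())
    return len(h & r) / max(1, len(h | r))

def generate_paraphrases(reference: str) -> list:
    # Placeholder: a paraphrase generator such as PRISM would go here.
    return [reference]

def par_metric(hypothesis: str, reference: str) -> float:
    # Score against the reference and all paraphrases; keep the best.
    candidates = [reference] + generate_paraphrases(reference)
    return max(metric(hypothesis, c) for c in candidates)
```

With a real paraphraser, legitimate rewordings of the reference stop being penalized, which is the motivation for augmenting the baseline metrics.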
no code implementations • EAMT 2022 • Taido Purason, Andre Tättar
Large multilingual Transformer-based machine translation models have played a pivotal role in making translation systems available for hundreds of languages, with good zero-shot translation performance.
no code implementations • EAMT 2022 • Toms Bergmanis, Marcis Pinnis, Roberts Rozis, Jānis Šlapiņš, Valters Šics, Berta Bernāne, Guntars Pužulis, Endijs Titomers, Andre Tättar, Taido Purason, Hele-Andra Kuulmets, Agnes Luhtaru, Liisa Rätsep, Maali Tars, Annika Laumets-Tättar, Mark Fishel
We present the MTee project - a research initiative funded via an Estonian public procurement to develop machine translation technology that is open-source and free of charge.
no code implementations • NoDaLiDa 2021 • Maali Tars, Andre Tättar, Mark Fišel
Multilingual training is an effective method for improving extremely low-resource neural machine translation; it can be further strengthened by leveraging monolingual data to create synthetic bilingual corpora via back-translation.
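The back-translation step described above can be sketched minimally: monolingual target-side sentences are paired with synthetic source sentences produced by a reverse-direction model. Here `reverse_translate` is a hypothetical placeholder for a trained target-to-source NMT model, not the actual system from the paper.

```python
# Minimal sketch of building a synthetic bilingual corpus via
# back-translation. `reverse_translate` stands in for a real
# target->source NMT model (placeholder: string reversal).

def reverse_translate(sentence: str) -> str:
    # Placeholder: a real system would decode with a reverse NMT model.
    return sentence[::-1]

def back_translate(target_monolingual: list) -> list:
    """Pair each target-side monolingual sentence with a synthetic
    source sentence, yielding (synthetic_source, target) pairs."""
    return [(reverse_translate(t), t) for t in target_monolingual]

# Example with Estonian monolingual data (target side kept genuine).
corpus = back_translate(["tere maailm", "head aega"])
```

The key property is that the target side of each synthetic pair is genuine text, so the translation model trains toward fluent target-language output.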
Tasks: Low-Resource Neural Machine Translation, Transfer Learning (+1)
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Rachel Bawden, Biao Zhang, Lisa Yankovskaya, Andre Tättar, Matt Post
We investigate a long-perceived shortcoming in the typical use of BLEU: its reliance on a single reference.
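The single-reference limitation of BLEU comes from how n-gram counts are clipped: with multiple references, a hypothesis n-gram matches if it appears in any reference, so valid rewordings are credited. A minimal sketch of multi-reference clipped precision (the BLEU building block; this is an illustration, not the paper's implementation):

```python
# Clipped n-gram precision against multiple references, as used in
# BLEU. Hypothesis n-gram counts are clipped by the maximum count of
# that n-gram across all references, so any reference can supply a match.
from collections import Counter

def clipped_precision(hyp_tokens, refs_tokens, n=1):
    def ngrams(toks):
        return Counter(tuple(toks[i:i + n]) for i in range(len(toks) - n + 1))

    hyp = ngrams(hyp_tokens)
    max_ref = Counter()
    for ref in refs_tokens:
        for gram, count in ngrams(ref).items():
            max_ref[gram] = max(max_ref[gram], count)
    matched = sum(min(count, max_ref[gram]) for gram, count in hyp.items())
    return matched / max(1, sum(hyp.values()))
```

For example, "the cat sat" scored against the single reference "a cat sat" gets unigram precision 2/3, but adding a second reference "the dog sat" raises it to 1.0, since "the" now finds a match. Adding references can only raise the clipped counts, which is why single-reference BLEU systematically under-credits valid translations.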
1 code implementation • 28 Mar 2020 • Anti Ingel, Novin Shahroudi, Markus Kängsepp, Andre Tättar, Viacheslav Komisarenko, Meelis Kull
We participated in the M4 competition for time series forecasting and describe here our methods for forecasting daily time series.