Search Results for author: Andre Tättar

Found 6 papers, 2 papers with code

ParBLEU: Augmenting Metrics with Automatic Paraphrases for the WMT’20 Metrics Shared Task

no code implementations • WMT (EMNLP) 2020 • Rachel Bawden, Biao Zhang, Andre Tättar, Matt Post

We describe parBLEU, parCHRF++, and parESIM, which augment baseline metrics with automatically generated paraphrases produced by PRISM (Thompson and Post, 2020a), a multilingual neural machine translation system.

Machine Translation · Translation
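The core idea above — scoring a hypothesis against the original reference plus automatically generated paraphrases — can be illustrated with a toy sketch. The metric below is a deliberately simplified unigram precision, not the actual parBLEU implementation, and all function names are hypothetical:

```python
# Toy illustration of augmenting a reference-based metric with paraphrases.
# The metric and all names are simplified stand-ins, not the real parBLEU.

def unigram_precision(hypothesis, reference):
    """Fraction of hypothesis tokens that appear in the reference."""
    hyp = hypothesis.split()
    ref = set(reference.split())
    if not hyp:
        return 0.0
    return sum(tok in ref for tok in hyp) / len(hyp)

def paraphrase_augmented_score(hypothesis, reference, paraphrases):
    """Score against the original reference and each paraphrase; keep the best."""
    candidates = [reference] + list(paraphrases)
    return max(unigram_precision(hypothesis, r) for r in candidates)

hyp = "the cat sat on the mat"
ref = "a cat was sitting on the mat"
paras = ["the cat sat on a mat"]  # e.g. produced by an automatic paraphraser

baseline = unigram_precision(hyp, ref)
augmented = paraphrase_augmented_score(hyp, ref, paras)
print(baseline, augmented)
```

A valid hypothesis phrasing that diverges from the single reference is penalized by the baseline metric but matched by one of the paraphrases, so the augmented score can only be equal or higher.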

Multilingual Neural Machine Translation With the Right Amount of Sharing

no code implementations • EAMT 2022 • Taido Purason, Andre Tättar

Large multilingual Transformer-based machine translation models have played a pivotal role in making translation systems available for hundreds of languages, with good zero-shot translation performance.

Machine Translation · NMT +1

Extremely low-resource machine translation for closely related languages

no code implementations • NoDaLiDa 2021 • Maali Tars, Andre Tättar, Mark Fišel

An effective method to improve extremely low-resource neural machine translation is multilingual training, which can be improved by leveraging monolingual data to create synthetic bilingual corpora using the back-translation method.

Low-Resource Neural Machine Translation · Transfer Learning +1
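The back-translation step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's pipeline: `reverse_translate` is a hypothetical stand-in for a trained target-to-source MT model, which in practice would be a neural system:

```python
# Sketch of back-translation for data augmentation: a target-to-source model
# translates monolingual target-language text into synthetic source sentences,
# and the resulting synthetic pairs are added to the bilingual training corpus.

def reverse_translate(target_sentence):
    # Placeholder: a real system would run a trained target->source NMT model.
    return "<synthetic> " + target_sentence

def back_translate_corpus(bilingual_pairs, monolingual_targets):
    """Augment real (source, target) pairs with synthetic (source, target) pairs."""
    synthetic = [(reverse_translate(t), t) for t in monolingual_targets]
    return bilingual_pairs + synthetic

real_pairs = [("tere", "hello")]            # scarce genuine bilingual data
mono_targets = ["good morning", "thank you"]  # abundant monolingual data

augmented = back_translate_corpus(real_pairs, mono_targets)
print(len(augmented))
```

The target side of every synthetic pair is genuine text, which is why back-translation tends to help: the model learns to produce fluent target-language output even when the synthetic source side is noisy.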

Correlated daily time series and forecasting in the M4 competition

1 code implementation • 28 Mar 2020 • Anti Ingel, Novin Shahroudi, Markus Kängsepp, Andre Tättar, Viacheslav Komisarenko, Meelis Kull

We participated in the M4 competition for time series forecasting and describe here our methods for forecasting daily time series.

Time Series · Time Series Forecasting
