Low-Resource Neural Machine Translation

22 papers with code • 1 benchmark • 4 datasets

Low-resource machine translation is the task of machine translation for language pairs where large parallel corpora are not available.

Most implemented papers

Revisiting Low-Resource Neural Machine Translation: A Case Study

yuekai146/NMT ACL 2019

It has been shown that the performance of neural machine translation (NMT) drops sharply in low-resource conditions, underperforming phrase-based statistical machine translation (PBSMT) and requiring large amounts of auxiliary data to achieve competitive results.

Transfer Learning for Low-Resource Neural Machine Translation

isi-nlp/Zoph_RNN EMNLP 2016

Ensembling and unknown-word replacement add another 2 BLEU, which brings NMT performance on low-resource machine translation close to a strong syntax-based machine translation (SBMT) system, exceeding its performance on one language pair.
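
The core recipe of parent-child transfer can be illustrated compactly. Below is a minimal PyTorch sketch, not the paper's exact setup: the toy model, vocabulary sizes, and layer names are illustrative assumptions. A parent model trained on a high-resource pair initializes a child model for the low-resource pair, copying every parameter whose shape matches and leaving the rest randomly initialized before fine-tuning.

```python
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    """Toy encoder-decoder used only to illustrate weight transfer."""
    def __init__(self, src_vocab, tgt_vocab, d_model=64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, d_model)
        self.tgt_emb = nn.Embedding(tgt_vocab, d_model)
        self.encoder = nn.GRU(d_model, d_model, batch_first=True)
        self.decoder = nn.GRU(d_model, d_model, batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab)

    def forward(self, src, tgt):
        _, h = self.encoder(self.src_emb(src))
        dec, _ = self.decoder(self.tgt_emb(tgt), h)
        return self.out(dec)

# Parent trained on a high-resource pair into English (training loop omitted).
parent = TinySeq2Seq(src_vocab=8000, tgt_vocab=8000)

# Child translates a different, low-resource source language into English:
# the target side is shared, so most parameters can be copied; the source
# embedding has a new shape and therefore stays randomly initialized.
child = TinySeq2Seq(src_vocab=3000, tgt_vocab=8000)
child_state = child.state_dict()
transferred = {k: v for k, v in parent.state_dict().items()
               if k in child_state and v.shape == child_state[k].shape}
child_state.update(transferred)
child.load_state_dict(child_state)
print(f"transferred {len(transferred)}/{len(child_state)} parameter tensors")
# Fine-tuning the child on the low-resource parallel data would follow here.
```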

Data Augmentation for Low-Resource Neural Machine Translation

marziehf/DataAugmentationNMT ACL 2017

The quality of a Neural Machine Translation system depends substantially on the availability of sizable parallel corpora.
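
One family of augmentation techniques creates new sentence pairs by substituting aligned words in existing pairs. The sketch below is a simplified illustration of that idea, not the paper's targeted rare-word method; the toy lexicon, the `augment` function, and the alignment format are assumptions made for the example.

```python
import random

# Toy bilingual lexicon mapping an aligned (src, tgt) word pair to
# candidate replacement pairs; a real system would derive these from
# a language model and a lexical translation table.
SWAP_LEXICON = {
    ("house", "haus"): [("cottage", "hütte"), ("mansion", "villa")],
}

def augment(src_tokens, tgt_tokens, alignment, n_new=1, seed=0):
    """Yield up to n_new augmented pairs by substituting one aligned word."""
    rng = random.Random(seed)
    new_pairs = []
    for s_idx, t_idx in alignment:
        key = (src_tokens[s_idx], tgt_tokens[t_idx])
        if key in SWAP_LEXICON:
            new_src_word, new_tgt_word = rng.choice(SWAP_LEXICON[key])
            new_src, new_tgt = list(src_tokens), list(tgt_tokens)
            new_src[s_idx] = new_src_word
            new_tgt[t_idx] = new_tgt_word
            new_pairs.append((new_src, new_tgt))
            if len(new_pairs) >= n_new:
                break
    return new_pairs

print(augment(["the", "house", "is", "old"],
              ["das", "haus", "ist", "alt"],
              alignment=[(1, 1)]))
```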

Bi-Directional Differentiable Input Reconstruction for Low-Resource Neural Machine Translation

xingniu/sockeye NAACL 2019

We aim to better exploit the limited amounts of parallel text available in low-resource settings by introducing a differentiable reconstruction loss for neural machine translation (NMT).
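
The key idea, keeping the reconstruction signal differentiable, can be approximated by feeding a backward model the expected target embeddings rather than discrete translations. The sketch below is one possible realization under that assumption and does not reproduce the paper's exact formulation; the toy backward model and dimensions are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

d_model, src_vocab, tgt_vocab = 32, 40, 50
tgt_embedding = nn.Embedding(tgt_vocab, d_model)
# Toy backward "model": projects expected target embeddings to source logits,
# standing in for a full target-to-source decoder.
bwd_model = nn.Linear(d_model, src_vocab)

def soft_reconstruction_loss(fwd_logits, src_ids):
    """Differentiable reconstruction of the source from soft forward outputs."""
    probs = F.softmax(fwd_logits, dim=-1)        # (batch, len, tgt_vocab)
    soft_emb = probs @ tgt_embedding.weight      # expected target embeddings
    rec_logits = bwd_model(soft_emb)             # (batch, len, src_vocab)
    return F.cross_entropy(rec_logits.transpose(1, 2), src_ids)

fwd_logits = torch.randn(2, 6, tgt_vocab, requires_grad=True)
src_ids = torch.randint(0, src_vocab, (2, 6))
loss = soft_reconstruction_loss(fwd_logits, src_ids)
loss.backward()                                  # gradients flow through the softmax
print(loss.item())
```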

Effective Cross-lingual Transfer of Neural Machine Translation Models without Shared Vocabularies

yunsukim86/sockeye-transfer ACL 2019

Transfer learning and multilingual models are essential for low-resource neural machine translation (NMT), but their applicability has been limited to cognate languages that share a vocabulary.
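
One ingredient commonly used for transfer without a shared vocabulary is mapping the new language's word embeddings into the parent model's embedding space. The sketch below shows an orthogonal (Procrustes) mapping learned from a seed dictionary; it illustrates that general technique rather than the paper's exact procedure, and the embeddings are random placeholders.

```python
import torch

def procrustes_map(child_vecs, parent_vecs):
    """Orthogonal W minimizing ||child_vecs @ W - parent_vecs||_F.

    child_vecs, parent_vecs: (n_pairs, d) embeddings of seed-dictionary
    word pairs in the child and parent languages, respectively.
    """
    u, _, vt = torch.linalg.svd(child_vecs.T @ parent_vecs)
    return u @ vt

# Toy embeddings for a 100-word seed dictionary in a 64-dim space.
child = torch.randn(100, 64)
parent = torch.randn(100, 64)
W = procrustes_map(child, parent)
mapped = child @ W        # child embeddings expressed in the parent space
print(torch.allclose(W @ W.T, torch.eye(64), atol=1e-5))  # W is orthogonal
```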

Exploiting Out-of-Domain Parallel Data through Multilingual Transfer Learning for Low-Resource Neural Machine Translation

aizhanti/jarunc WS 2019

This paper proposes a novel multilingual multistage fine-tuning approach for low-resource neural machine translation (NMT), taking a challenging Japanese--Russian pair for benchmarking.

Improving Back-Translation with Uncertainty-based Confidence Estimation

THUNLP-MT/UCE4BT IJCNLP 2019

While back-translation is simple and effective in exploiting abundant monolingual corpora to improve low-resource neural machine translation (NMT), the synthetic bilingual corpora generated by NMT models trained on limited authentic bilingual data are inevitably noisy.
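
A simple way to act on such confidence estimates is to down-weight noisy synthetic pairs in the training objective. The sketch below shows a per-sentence confidence-weighted cross-entropy; the confidence values are stand-ins, since the paper derives them from uncertainty estimates rather than the fixed numbers used here.

```python
import torch
import torch.nn.functional as F

def weighted_nmt_loss(logits, targets, confidences, pad_id=0):
    """Cross-entropy per sentence, scaled by a per-sentence confidence.

    logits:      (batch, seq_len, vocab) model outputs
    targets:     (batch, seq_len) gold target ids
    confidences: (batch,) values in [0, 1], e.g. from an uncertainty model
    """
    token_loss = F.cross_entropy(
        logits.transpose(1, 2), targets, ignore_index=pad_id, reduction="none"
    )                                          # (batch, seq_len)
    mask = (targets != pad_id).float()
    sent_loss = (token_loss * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)
    return (confidences * sent_loss).mean()

# Toy usage with random tensors standing in for real model outputs.
logits = torch.randn(2, 5, 100)
targets = torch.randint(1, 100, (2, 5))
confidences = torch.tensor([0.9, 0.4])         # the second synthetic pair is noisier
print(weighted_nmt_loss(logits, targets, confidences).item())
```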

Low Resource Neural Machine Translation: A Benchmark for Five African Languages

surafelml/Afro-NMT 31 Mar 2020

Recent advances in Neural Machine Translation (NMT) have shown improvements in low-resource language (LRL) translation tasks.

Language Model Prior for Low-Resource Neural Machine Translation

cbaziotis/lm-prior-for-nmt EMNLP 2020

A common solution to the scarcity of parallel data in low-resource NMT is to exploit the knowledge of language models (LMs) trained on abundant monolingual data.
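
Concretely, an LM prior can enter training as a regularizer that pulls the translation model's output distribution towards a frozen LM's distribution over the same target prefix. The sketch below is an approximate rendering of that idea; the KL direction, temperature, and weighting are assumptions and may differ from the paper's formulation.

```python
import torch
import torch.nn.functional as F

def lm_prior_loss(tm_logits, lm_logits, targets, lam=0.5, tau=2.0, pad_id=0):
    """Cross-entropy on gold targets plus a per-token KL(TM || LM) penalty.

    tm_logits, lm_logits: (batch, seq_len, vocab); the LM scores the same
    target prefixes and is kept frozen during NMT training.
    """
    ce = F.cross_entropy(tm_logits.transpose(1, 2), targets, ignore_index=pad_id)
    log_p_tm = F.log_softmax(tm_logits / tau, dim=-1)
    log_p_lm = F.log_softmax(lm_logits / tau, dim=-1)
    kl = F.kl_div(log_p_lm, log_p_tm, log_target=True, reduction="none").sum(-1)
    mask = (targets != pad_id).float()
    kl = (kl * mask).sum() / mask.sum().clamp(min=1)
    return ce + lam * kl

# Toy usage with random tensors in place of real model outputs.
tm_logits = torch.randn(2, 4, 50)
lm_logits = torch.randn(2, 4, 50)
targets = torch.randint(1, 50, (2, 4))
print(lm_prior_loss(tm_logits, lm_logits, targets).item())
```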