Low Resource NMT

11 papers with code • 0 benchmarks • 0 datasets

Low-resource neural machine translation (NMT) covers translation between language pairs for which only limited parallel training data is available, typically relying on techniques such as transfer learning, monolingual pre-training, and data augmentation to reach usable quality.

Most implemented papers

Revisiting Low-Resource Neural Machine Translation: A Case Study

ewdowiak/Sicilian_Translator ACL 2019

It has been shown that the performance of neural machine translation (NMT) drops starkly in low-resource conditions, underperforming phrase-based statistical machine translation (PBSMT) and requiring large amounts of auxiliary data to achieve competitive results.
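A key adaptation the paper examines is shrinking the subword vocabulary, since large BPE vocabularies are poorly estimated from small corpora. A minimal sketch of that step using SentencePiece; the vocabulary size, file names, and example sentence are illustrative assumptions, not the paper's exact setup:

```python
# Sketch: train a deliberately small BPE vocabulary for a low-resource pair.
# File paths and vocab_size are hypothetical; tune them to the corpus size.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="train.sc-en.txt",  # hypothetical file, one sentence per line
    model_prefix="bpe_small",
    model_type="bpe",
    vocab_size=2000,          # far smaller than the 30k+ typical in high-resource NMT
)

sp = spm.SentencePieceProcessor(model_file="bpe_small.model")
print(sp.encode("Lu suli è càudu.", out_type=str))  # subword pieces
```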

On Optimal Transformer Depth for Low-Resource Language Translation

ElanVB/optimal_transformer_depth 9 Apr 2020

Therefore, by showing that transformer models perform well (and often best) at low-to-moderate depth, we hope to convince fellow researchers to devote fewer computational resources, and less time, to exploring overly large models during the development of these systems.
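To illustrate the low-to-moderate depth regime the authors advocate, here is a sketch of a shallow encoder-decoder transformer in PyTorch; the specific depths and dimensions are assumptions for demonstration, not the paper's tuned configuration:

```python
# Sketch: a shallow seq2seq transformer. Three layers per side is an
# illustrative "low-to-moderate" depth, not the paper's exact choice.
import torch
import torch.nn as nn

model = nn.Transformer(
    d_model=256,
    nhead=4,
    num_encoder_layers=3,
    num_decoder_layers=3,
    dim_feedforward=1024,
    dropout=0.3,
)

src = torch.rand(20, 8, 256)  # (src_len, batch, d_model)
tgt = torch.rand(15, 8, 256)  # (tgt_len, batch, d_model)
out = model(src, tgt)
print(out.shape)  # torch.Size([15, 8, 256])
```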

JASS: Japanese-specific Sequence to Sequence Pre-training for Neural Machine Translation

Mao-KU/JASS LREC 2020

Monolingual pre-training approaches such as MASS (MAsked Sequence to Sequence) are extremely effective in boosting NMT quality for languages with small parallel corpora.
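For readers unfamiliar with MASS, its pre-training objective masks a contiguous span of the input sentence and trains the decoder to reconstruct that span. A toy sketch of the data preparation step, operating on whitespace tokens for clarity where real implementations use subword IDs:

```python
# Toy MASS-style example builder: mask a contiguous source span; the
# decoder's target is the masked span itself.
import random

MASK = "<mask>"

def mass_example(tokens, mask_ratio=0.5):
    span_len = max(1, int(len(tokens) * mask_ratio))
    start = random.randrange(len(tokens) - span_len + 1)
    encoder_input = tokens[:start] + [MASK] * span_len + tokens[start + span_len:]
    decoder_target = tokens[start:start + span_len]
    return encoder_input, decoder_target

enc_in, dec_tgt = mass_example("watashi wa ringo o tabemasu".split())
print(enc_in)   # e.g. ['watashi', '<mask>', '<mask>', 'o', 'tabemasu']
print(dec_tgt)  # e.g. ['wa', 'ringo']
```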

Sicilian Translator: A Recipe for Low-Resource NMT

ewdowiak/Sicilian_Translator 5 Oct 2021

With 17,000 pairs of Sicilian-English translated sentences, Arba Sicula developed the first neural machine translator for the Sicilian language.

Towards Better Chinese-centric Neural Machine Translation for Low-resource Languages

wengsyx/low-resource-text-translation 9 Apr 2022

The last decade has witnessed enormous improvements in science and technology, stimulating a growing demand for economic and cultural exchange among countries.

ConsistTL: Modeling Consistency in Transfer Learning for Low-Resource Neural Machine Translation

nlp2ct/consisttl 8 Dec 2022

In this paper, we propose a novel transfer learning method for NMT, namely ConsistTL, which can continuously transfer knowledge from the parent model during the training of the child model.
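The snippet does not spell out the training objective, but one plausible reading of "continuously transfer knowledge" is a consistency term that pulls the child model's predictive distribution toward the parent's during training. A hedged sketch under that assumption; the KL formulation and weighting are illustrative, not the authors' exact loss:

```python
# Sketch: a consistency term between child and parent output distributions.
# The temperature and the 0.5 weight below are illustrative assumptions.
import torch
import torch.nn.functional as F

def consistency_loss(child_logits, parent_logits, temperature=1.0):
    # KL(parent || child) over the target vocabulary, averaged over the batch
    p = F.softmax(parent_logits / temperature, dim=-1)
    log_q = F.log_softmax(child_logits / temperature, dim=-1)
    return F.kl_div(log_q, p, reduction="batchmean")

# Hypothetical combined objective during child training:
# loss = cross_entropy(child_logits, gold) + 0.5 * consistency_loss(child_logits, parent_logits)
```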

Joint Dropout: Improving Generalizability in Low-Resource Neural Machine Translation through Phrase Pair Variables

aliaraabi/joint_dropout 24 Jul 2023

Despite the tremendous success of Neural Machine Translation (NMT), its performance on low-resource language pairs still remains subpar, partly due to the limited ability to handle previously unseen inputs, i.e., generalization.
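The title's "phrase pair variables" suggest replacing an aligned source/target phrase pair with a shared placeholder on both sides, so the model learns patterns that generalize across concrete phrases. A toy sketch of that substitution; the phrase table, sentences, and variable naming are invented for illustration:

```python
# Toy sketch: substitute one aligned phrase pair with a shared variable token.
def substitute_phrase_pair(src_tokens, tgt_tokens, src_phrase, tgt_phrase, var="X_1"):
    def replace(tokens, phrase):
        for i in range(len(tokens) - len(phrase) + 1):
            if tokens[i:i + len(phrase)] == phrase:
                return tokens[:i] + [var] + tokens[i + len(phrase):]
        return tokens  # phrase not found: leave the sentence unchanged
    return replace(src_tokens, src_phrase), replace(tgt_tokens, tgt_phrase)

src, tgt = substitute_phrase_pair(
    "ik zie de rode auto".split(), "I see the red car".split(),
    ["rode", "auto"], ["red", "car"],
)
print(src)  # ['ik', 'zie', 'de', 'X_1']
print(tgt)  # ['I', 'see', 'the', 'X_1']
```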

Low-resource neural machine translation with morphological modeling

anzeyimana/kinmt_naacl2024 3 Apr 2024

An attention augmentation scheme for the transformer model is proposed in a generic form to allow integration of pre-trained language models and to facilitate modeling of word-order relationships between the source and target languages.
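One loose way to picture such an augmentation is decoder cross-attention that attends over pre-trained language model states alongside the NMT encoder states. The concatenation-based fusion below is an assumption for illustration, not the paper's actual scheme:

```python
# Sketch: widen cross-attention memory with (projected) pre-trained LM states.
# All shapes and the concatenation strategy are illustrative assumptions.
import torch
import torch.nn as nn

d_model = 256
attn = nn.MultiheadAttention(embed_dim=d_model, num_heads=4, batch_first=True)

decoder_states = torch.rand(8, 15, d_model)  # (batch, tgt_len, d_model)
encoder_states = torch.rand(8, 20, d_model)  # NMT encoder output
plm_states = torch.rand(8, 20, d_model)      # pre-trained LM output, projected to d_model

memory = torch.cat([encoder_states, plm_states], dim=1)  # attend over both sources
out, _ = attn(decoder_states, memory, memory)
print(out.shape)  # torch.Size([8, 15, 256])
```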

From Priest to Doctor: Domain Adaptation for Low-Resource Neural Machine Translation

alimrsn79/da_lr_nmt 1 Dec 2024

Many of the world's languages have insufficient data to train high-performing general neural machine translation (NMT) models, let alone domain-specific models, and often the only available parallel data are small amounts of religious texts.