Search Results for author: Abubakar Isa

Found 3 papers, 0 papers with code

A Hybrid Approach for Improved Low Resource Neural Machine Translation using Monolingual Data

no code implementations · 14 Nov 2020 · Idris Abdulmumin, Bashir Shehu Galadanci, Abubakar Isa, Habeebah Adamu Kakudi, Ismaila Idris Sinan

Many language pairs are low resource, meaning the amount and/or quality of available parallel data is not sufficient to train a neural machine translation (NMT) model which can reach an acceptable standard of accuracy.

Low-Resource Neural Machine Translation · NMT · +2

Enhanced back-translation for low resource neural machine translation using self-training

no code implementations · 4 Jun 2020 · Idris Abdulmumin, Bashir Shehu Galadanci, Abubakar Isa

The synthetic data generated by the improved English-German backward model was used to train a forward model which outperformed another forward model trained using standard back-translation by 2.7 BLEU.
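One plausible reading of the pipeline described above, sketched with toy word-level "models" (the `make_model`, `train`, and `enhanced_back_translation` names and the dict-based translators are illustrative stand-ins, not the paper's actual implementation):

```python
# Hedged sketch of self-training-enhanced back-translation: the backward
# (target -> source) model is first retrained on its own output, then its
# synthetic pairs train the forward (source -> target) model.

def make_model(lexicon):
    # A "model" is just a word-for-word mapper built from a lexicon.
    return lambda sent: " ".join(lexicon.get(w, w) for w in sent.split())

def train(pairs):
    # "Training" here memorises a word lexicon from aligned sentence pairs
    # (a stand-in for fitting a real NMT system).
    lexicon = {}
    for src, tgt in pairs:
        for s, t in zip(src.split(), tgt.split()):
            lexicon.setdefault(s, t)
    return make_model(lexicon)

def enhanced_back_translation(backward, mono_target):
    # 1. The backward model back-translates monolingual target-side data.
    first_pass = [(backward(t), t) for t in mono_target]
    # 2. Self-training: retrain the backward model on its own
    #    (target, synthetic source) pairs.
    improved_backward = train([(t, s) for s, t in first_pass])
    # 3. The improved backward model regenerates synthetic pairs, which
    #    then train the forward model.
    synthetic = [(improved_backward(t), t) for t in mono_target]
    return train(synthetic)

if __name__ == "__main__":
    backward = make_model({"hallo": "hello", "welt": "world"})
    forward = enhanced_back_translation(backward, ["hallo welt"])
    print(forward("hello world"))  # -> hallo welt
```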

Low-Resource Neural Machine Translation · NMT · +1

Iterative Batch Back-Translation for Neural Machine Translation: A Conceptual Model

no code implementations · 26 Nov 2019 · Idris Abdulmumin, Bashir Shehu Galadanci, Abubakar Isa

An effective method to generate a large number of parallel sentences for training improved neural machine translation (NMT) systems is the use of back-translations of the target-side monolingual data.
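The core back-translation idea described above can be sketched as follows; the dict-based backward "model" is a toy stand-in for a trained target-to-source NMT system, and all names here are illustrative:

```python
# Minimal sketch of back-translation for data augmentation: a backward
# (target -> source) model turns monolingual target-side text into
# synthetic (source, target) pairs that augment the real parallel data.

def back_translate(mono_target, backward_lexicon):
    # Produce (synthetic source, real target) pairs word-for-word.
    def bt(sent):
        return " ".join(backward_lexicon.get(w, w) for w in sent.split())
    return [(bt(t), t) for t in mono_target]

# Toy data: one real parallel pair plus one monolingual target sentence.
real_parallel = [("good morning", "guten morgen")]
mono_target = ["hallo welt"]
backward_lexicon = {"hallo": "hello", "welt": "world"}

# The augmented set mixes real and synthetic pairs for forward-model training.
augmented = real_parallel + back_translate(mono_target, backward_lexicon)
```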

Machine Translation · NMT · +1
