1 code implementation • 3 Apr 2024 • Antoine Nzeyimana
A generic attention augmentation scheme for the transformer model is proposed, allowing integration of pre-trained language models and facilitating the modeling of word-order relationships between the source and target languages.
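As a loose illustration of the general idea (not the paper's actual formulation), one way to augment cross-attention with word-order information is to add a learned bias keyed on the clipped relative position between target and source tokens; everything below, including the `pos_bias` table, is a hypothetical sketch:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def augmented_attention(queries, keys, pos_bias):
    # pos_bias maps a clipped relative position (src_idx - tgt_idx)
    # to an additive score, nudging the model toward roughly monotone
    # source-target alignments. Illustrative assumption only.
    out = []
    for i, q in enumerate(queries):
        scores = []
        for j, k in enumerate(keys):
            dot = sum(a * b for a, b in zip(q, k))
            rel = max(-2, min(2, j - i))  # clip relative position
            scores.append(dot + pos_bias[rel])
        out.append(softmax(scores))
    return out

# Tiny 2-token example with 2-dimensional query/key vectors.
q = [[1.0, 0.0], [0.0, 1.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
bias = {-2: -1.0, -1: -0.5, 0: 0.5, 1: -0.5, 2: -1.0}
attn = augmented_attention(q, k, bias)
```

With the diagonal-favoring bias table above, each target position attends most strongly to the source position at the same index.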
Data Augmentation • Low-Resource Neural Machine Translation • +2
no code implementations • 23 Aug 2023 • Antoine Nzeyimana
In this work, we show that self-supervised pre-training, a simple curriculum schedule during fine-tuning, and semi-supervised learning that leverages large unlabelled speech datasets significantly improve speech recognition performance for Kinyarwanda.
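A curriculum schedule for fine-tuning can be as simple as releasing progressively harder examples over training stages. The sketch below uses utterance duration as a difficulty proxy, which is a common choice in speech recognition but an assumption here, as the paper's actual schedule may differ:

```python
def curriculum_stages(examples, n_stages=3):
    # Sort utterances by duration (shorter = assumed easier) and
    # yield cumulative subsets: each stage adds longer utterances.
    ordered = sorted(examples, key=lambda ex: ex["duration"])
    stage_size = -(-len(ordered) // n_stages)  # ceiling division
    for stage in range(1, n_stages + 1):
        yield ordered[: stage * stage_size]

# Toy dataset of utterances with durations in seconds.
data = [{"id": i, "duration": d}
        for i, d in enumerate([9.0, 1.5, 4.2, 2.8, 7.1, 3.3])]
stages = list(curriculum_stages(data))
# Stage 1 holds only the shortest utterances; the last stage is the full set.
```

During fine-tuning one would train a few epochs on each stage in order, so the model sees easy utterances before long, hard ones.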
no code implementations • 25 Apr 2023 • Antoine Nzeyimana
This paper describes the system submitted by the author to SemEval-2023 Task 12: Sentiment Analysis for African Languages.
1 code implementation • ACL 2022 • Antoine Nzeyimana, Andre Niyongabo Rubungo
We address these challenges by proposing a simple yet effective two-tier BERT architecture that leverages a morphological analyzer and explicitly represents morphological compositionality.
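A two-tier flow of this kind can be sketched as: a morphological analyzer splits each word into morphemes, morpheme embeddings are composed into one vector per word, and the resulting word-level sequence is what a sentence encoder would consume. Everything below is a toy stand-in (the splitting rule, embedding, and averaging composer are assumptions; the actual architecture uses transformer layers at both tiers):

```python
def analyze(word):
    # Stand-in for a real Kinyarwanda morphological analyzer:
    # here words arrive pre-segmented with '-' marking boundaries.
    return word.split("-")

def embed(morpheme, dim=4):
    # Toy deterministic embedding derived from character codes,
    # just to give each morpheme a vector.
    val = (sum(ord(c) for c in morpheme) % 97) / 97.0
    return [val] * dim

def compose(morph_vectors):
    # Average morpheme vectors into a single word vector; the real
    # model composes them with a small morpheme-level encoder.
    n = len(morph_vectors)
    dim = len(morph_vectors[0])
    return [sum(v[i] for v in morph_vectors) / n for i in range(dim)]

def encode_sentence(words):
    # One composed vector per word: the input to the sentence tier.
    return [compose([embed(m) for m in analyze(w)]) for w in words]

sentence = ["a-ra-kora", "neza"]  # toy pre-segmented input
reps = encode_sentence(sentence)
```

The key design point is that the sentence-level tier sees exactly one position per word, while morphological compositionality is handled explicitly in the lower tier rather than by subword splitting alone.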
no code implementations • COLING 2020 • Antoine Nzeyimana
Kinyarwanda, a morphologically rich language, currently lacks tools for automated morphological analysis.