no code implementations • 25 Jan 2024 • Richard Kimera, Daniela N. Rim, Joseph Kirabira, Ubong Godwin Udomah, Heeyoul Choi
Depression is a global burden and one of the most challenging mental health conditions to control.
no code implementations • 16 Aug 2023 • Daniela N. Rim, Kimera Richard, Heeyoul Choi
The Transformer model has revolutionized Natural Language Processing tasks such as Neural Machine Translation, and many efforts have been made to study the Transformer architecture, increasing its efficiency and accuracy.
no code implementations • 7 Jan 2023 • Richard Kimera, Daniela N. Rim, Heeyoul Choi
Neural machine translation (NMT) has achieved great success with large datasets, so NMT is largely premised on high-resource languages.
no code implementations • 19 Sep 2021 • Daniela N. Rim, DongNyeong Heo, Heeyoul Choi
In this work, we propose adversarial training with contrastive learning (ATCL) to adversarially train models for language processing tasks while leveraging the benefits of contrastive learning.
no code implementations • 25 May 2021 • Daniela N. Rim, Inseon Jang, Heeyoul Choi
We apply a reparametrization trick to the Bernoulli distribution over the discrete representations, which allows smooth backpropagation.
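The abstract does not specify which Bernoulli reparametrization is used; a common choice is a binary-concrete (Gumbel-Sigmoid) style relaxation, sketched below with NumPy. The function name `relaxed_bernoulli` and the temperature value are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def relaxed_bernoulli(logits, temperature, rng):
    # Sample logistic noise: L = log(U) - log(1 - U), with U ~ Uniform(0, 1).
    u = rng.uniform(size=np.shape(logits))
    noise = np.log(u) - np.log1p(-u)
    # A sigmoid of the noise-shifted, temperature-scaled logits yields a
    # sample in (0, 1) that is differentiable with respect to `logits`,
    # enabling smooth backpropagation through the sampling step.
    return 1.0 / (1.0 + np.exp(-(logits + noise) / temperature))

rng = np.random.default_rng(0)
logits = np.array([2.0, -2.0, 0.0])
soft = relaxed_bernoulli(logits, temperature=0.5, rng=rng)
# At inference time the relaxed sample can be thresholded to a hard 0/1 code.
hard = (soft > 0.5).astype(float)
```

As the temperature approaches zero, the relaxed samples concentrate near 0 and 1, recovering near-discrete Bernoulli behavior while keeping gradients defined during training.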