no code implementations • WMT (EMNLP) 2021 • Jeonghyeok Park, Hyunjoong Kim, Hyunchang Cho
The provided parallel data are Russian-Chinese (direct), Russian-English (indirect), and English-Chinese (indirect).
no code implementations • 24 Oct 2022 • Jiyoung Lee, Hantae Kim, Hyunchang Cho, Edward Choi, Cheonbok Park
Multi-domain Neural Machine Translation (NMT) trains a single model on data from multiple domains.
no code implementations • Findings (ACL) 2022 • Cheonbok Park, Hantae Kim, Ioan Calapodescu, Hyunchang Cho, Vassilina Nikoulina
Domain Adaptation (DA) of a Neural Machine Translation (NMT) model often relies on a pre-trained general NMT model that is adapted to the new domain using a sample of in-domain parallel data.
1 code implementation • 6 Jul 2021 • Won Ik Cho, Seok Min Kim, Hyunchang Cho, Nam Soo Kim
Most speech-to-text (S2T) translation studies use English speech as the source, which makes it difficult for non-English speakers to take advantage of S2T technologies.
no code implementations • EAMT 2020 • Jihyung Moon, Hyunchang Cho, Eunjeong L. Park
Quality estimation (QE) is the task of automatically evaluating the quality of translations without human-translated references.