no code implementations • ACL 2022 • Yanling Xiao, Lemao Liu, Guoping Huang, Qu Cui, ShuJian Huang, Shuming Shi, Jiajun Chen
In this work, we propose BiTIIMT, a novel Bilingual Text-Infilling system for Interactive Neural Machine Translation.
no code implementations • WMT (EMNLP) 2020 • Qu Cui, Xiang Geng, ShuJian Huang, Jiajun Chen
This paper describes our system for the sentence-level and word-level Quality Estimation shared tasks of WMT20.
1 code implementation • 23 Feb 2023 • Yuhan Dai, Zhirui Zhang, Qiuzhi Liu, Qu Cui, Weihua Li, Yichao Du, Tong Xu
$k$NN-MT is a straightforward yet powerful approach for fast domain adaptation: it augments a pre-trained neural machine translation (NMT) model with domain-specific token-level $k$-nearest-neighbor ($k$NN) retrieval, achieving domain adaptation without retraining.
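The retrieve-and-interpolate idea behind $k$NN-MT can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the brute-force L2 search, and the fixed interpolation weight `lam` are illustrative assumptions; real systems use an approximate-nearest-neighbor index over a large datastore of (decoder hidden state, target token) pairs.

```python
import numpy as np

def knn_mt_step(hidden, model_probs, keys, values, vocab_size,
                k=4, temperature=10.0, lam=0.5):
    """One decoding step of kNN-MT-style interpolation (illustrative sketch).

    hidden:      decoder hidden state at the current step, shape (d,)
    model_probs: NMT distribution over the vocabulary, shape (vocab_size,)
    keys:        datastore key vectors (hidden states from the domain
                 corpus), shape (n, d)
    values:      target-token ids paired with each key, shape (n,)
    """
    # Retrieve the k nearest datastore entries by L2 distance
    # (a real system would query an ANN index instead).
    dists = np.linalg.norm(keys - hidden, axis=1)
    nearest = np.argsort(dists)[:k]

    # Softmax over negative distances gives a distribution
    # over the retrieved entries.
    logits = -dists[nearest] / temperature
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()

    # Scatter the retrieval weights onto the vocabulary,
    # summing weights of entries that share a target token.
    knn_probs = np.zeros(vocab_size)
    for w, tok in zip(weights, values[nearest]):
        knn_probs[tok] += w

    # Interpolate the retrieval and model distributions.
    return lam * knn_probs + (1.0 - lam) * model_probs
```

Because the datastore is built once from domain data and only consulted at decode time, swapping domains amounts to swapping datastores, which is what makes the approach attractive for fast adaptation.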
no code implementations • ACL 2021 • Qiuxiang He, Guoping Huang, Qu Cui, Li Li, Lemao Liu
It is generally believed that a translation memory (TM) should be beneficial for machine translation tasks.
no code implementations • 15 May 2021 • Qu Cui, ShuJian Huang, Jiahuan Li, Xiang Geng, Zaixiang Zheng, Guoping Huang, Jiajun Chen
However, we argue that there are gaps between the predictor and the estimator in both data quality and training objectives, which prevent QE models from benefiting more directly from large-scale parallel corpora.