Search Results for author: Qu Cui

Found 5 papers, 1 paper with code

NJU’s submission to the WMT20 QE Shared Task

no code implementations · WMT (EMNLP) 2020 · Qu Cui, Xiang Geng, ShuJian Huang, Jiajun Chen

This paper describes our system of the sentence-level and word-level Quality Estimation Shared Task of WMT20.

Language Modelling · Sentence

Simple and Scalable Nearest Neighbor Machine Translation

1 code implementation · 23 Feb 2023 · Yuhan Dai, Zhirui Zhang, Qiuzhi Liu, Qu Cui, Weihua Li, Yichao Du, Tong Xu

$k$NN-MT is a straightforward yet powerful approach for fast domain adaptation, which equips pre-trained neural machine translation (NMT) models with domain-specific token-level $k$-nearest-neighbor ($k$NN) retrieval to achieve domain adaptation without retraining.

Domain Adaptation · Machine Translation · +4
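The token-level retrieval-and-interpolation idea behind $k$NN-MT can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, parameter choices, and the numpy-array datastore are assumptions; real systems use an approximate-nearest-neighbor index (e.g. FAISS) over hidden states cached from a forward pass on domain data.

```python
import numpy as np

def knn_mt_probs(query, datastore_keys, datastore_tokens, model_probs,
                 vocab_size, k=4, temperature=10.0, lam=0.5):
    """Interpolate an NMT model's next-token distribution with a kNN
    retrieval distribution built from a token-level datastore.

    query            : decoder hidden state at the current step, shape (d,)
    datastore_keys   : cached decoder hidden states, shape (n, d)
    datastore_tokens : target token id paired with each key, shape (n,)
    model_probs      : pre-trained NMT distribution, shape (vocab_size,)
    """
    # Squared L2 distance from the query to every key in the datastore.
    dists = np.sum((datastore_keys - query) ** 2, axis=1)
    # Indices of the k nearest neighbors (exact search for the sketch).
    nn = np.argpartition(dists, k)[:k]
    # Softmax over negative distances -> one weight per retrieved neighbor.
    w = np.exp(-dists[nn] / temperature)
    w /= w.sum()
    # Aggregate neighbor weights onto their paired target tokens.
    knn_probs = np.zeros(vocab_size)
    np.add.at(knn_probs, datastore_tokens[nn], w)
    # Interpolate the retrieval and model distributions.
    return lam * knn_probs + (1 - lam) * model_probs

# Toy usage: a query identical to a stored key should boost that key's token.
rng = np.random.default_rng(0)
keys = rng.normal(size=(100, 8))
tokens = rng.integers(0, 10, size=100)
probs = knn_mt_probs(keys[3], keys, tokens, np.full(10, 0.1),
                     vocab_size=10, k=4)
```

Because the datastore and interpolation weight are the only domain-specific parts, swapping domains means swapping the datastore, with no retraining of the NMT model itself.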

DirectQE: Direct Pretraining for Machine Translation Quality Estimation

no code implementations15 May 2021 Qu Cui, ShuJian Huang, Jiahuan Li, Xiang Geng, Zaixiang Zheng, Guoping Huang, Jiajun Chen

However, we argue that there are gaps between the predictor and the estimator in both data quality and training objectives, which preclude QE models from benefiting from a large number of parallel corpora more directly.

Machine Translation · Translation
