no code implementations • WMT (EMNLP) 2020 • Jiayi Wang, Ke Wang, Kai Fan, Yuqi Zhang, Jun Lu, Xin Ge, Yangbin Shi, Yu Zhao
We also apply an imitation learning strategy to augment a substantial amount of pseudo APE training data, which helps prevent the model from overfitting to the limited real training data and boosts performance on held-out data.
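The exact imitation-learning procedure is not shown here, but a common way to build pseudo APE triples is to treat a reference translation as a pseudo post-edit of a machine translation of the same source. A minimal sketch under that assumption (the `translate` function and all data are hypothetical, not from the paper):

```python
def make_pseudo_ape_triples(parallel, translate):
    """Build pseudo APE triples (src, mt, pe) from parallel data.

    The reference target serves as a pseudo post-edit of the MT output.
    `translate` is a hypothetical MT function standing in for a real system.
    """
    triples = []
    for src, ref in parallel:
        mt = translate(src)
        if mt != ref:  # identical outputs carry no editing signal
            triples.append({"src": src, "mt": mt, "pe": ref})
    return triples


# Toy example with a stub "MT system" (purely illustrative):
parallel = [("guten Tag", "good day"), ("danke", "thanks")]
toy_translate = {"guten Tag": "good day", "danke": "thank you"}.get
triples = make_pseudo_ape_triples(parallel, toy_translate)
```

Here only the second pair yields a triple, since the stub translation of the first pair already matches its reference and contributes no editing signal.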
no code implementations • WMT (EMNLP) 2020 • Jun Lu, Xin Ge, Yangbin Shi, Yuqi Zhang
In the filtering task, three main methods are applied to evaluate the quality of the parallel corpus, i.e., a) a dual bilingual GPT-2 model, b) a dual conditional cross-entropy model, and c) the IBM word alignment model.
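As a rough illustration of the dual conditional cross-entropy idea, a sentence pair can be scored from the per-token cross-entropies of a forward (source-to-target) and a backward (target-to-source) translation model, rewarding both low cross-entropy and agreement between the two directions. The combination formula and all numbers below are a hedged sketch, not the authors' exact implementation:

```python
import math


def dual_ce_score(ce_fwd, ce_bwd):
    """Score a sentence pair from two per-token cross-entropies.

    Combines the disagreement |ce_fwd - ce_bwd| with the average
    cross-entropy, then maps to (0, 1]; higher means a cleaner pair.
    """
    raw = abs(ce_fwd - ce_bwd) + 0.5 * (ce_fwd + ce_bwd)
    return math.exp(-raw)


# Hypothetical per-token cross-entropies for three candidate pairs:
pairs = {
    "clean pair": (1.2, 1.3),
    "noisy pair": (5.0, 5.2),
    "one-sided pair": (1.1, 5.2),  # fluent one way, implausible the other
}
scores = {name: dual_ce_score(fw, bw) for name, (fw, bw) in pairs.items()}
```

Note that the disagreement term penalizes the one-sided pair heavily, which is the point of scoring in both directions: a pair that only one model finds likely is suspect.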
no code implementations • 16 Nov 2022 • Xin Ge, Ke Wang, Jiayi Wang, Nini Xiao, Xiangyu Duan, Yu Zhao, Yuqi Zhang
The final leaderboard shows that our submissions ranked first in three of the four language directions of the Naive TS track of the WMT22 Translation Suggestion task.
1 code implementation • 14 Nov 2022 • Ke Wang, Xin Ge, Jiayi Wang, Yu Zhao, Yuqi Zhang
In computer-aided translation, human translators post-edit machine translations to correct their errors.