Search Results for author: Changfeng Zhu

Found 4 papers, 0 papers with code

Towards Reliable Neural Machine Translation with Consistency-Aware Meta-Learning

no code implementations · 20 Mar 2023 · Rongxiang Weng, Qiang Wang, Wensen Cheng, Changfeng Zhu, Min Zhang

A contributing factor to this problem is that NMT models trained with the one-to-one paradigm struggle to handle the source diversity phenomenon, where inputs with the same meaning can be expressed differently.

Tasks: Bilevel Optimization, Diversity, +5

Language-aware Interlingua for Multilingual Neural Machine Translation

no code implementations · ACL 2020 · Changfeng Zhu, Heng Yu, Shanbo Cheng, Weihua Luo

However, the traditional multilingual model fails to capture the diversity and specificity of different languages, resulting in inferior performance compared with individual models that are sufficiently trained.

Tasks: Decoder, Diversity, +4

AR: Auto-Repair the Synthetic Data for Neural Machine Translation

no code implementations · 5 Apr 2020 · Shanbo Cheng, Shaohui Kuang, Rongxiang Weng, Heng Yu, Changfeng Zhu, Weihua Luo

Compared with using only limited authentic parallel data as the training corpus, many studies have shown that incorporating synthetic parallel data, generated by back translation (BT) or forward translation (FT, also known as self-training), into the NMT training process can significantly improve translation quality.
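As a rough illustration of the back-translation setup the abstract refers to, the sketch below pairs monolingual target-side sentences with machine-generated source sentences to form synthetic parallel data. The `backward_translate` stub (a word-reversal placeholder) and all sentence data are assumptions for illustration, not the paper's actual models or corpora.

```python
def backward_translate(target_sentence):
    # Placeholder for a target-to-source NMT model; a real BT pipeline
    # would call a trained reverse-direction translation system here.
    return " ".join(reversed(target_sentence.split()))

def build_synthetic_corpus(monolingual_targets):
    # BT pairs each monolingual target sentence with a machine-generated
    # source, yielding (synthetic_source, authentic_target) pairs.
    return [(backward_translate(t), t) for t in monolingual_targets]

# Synthetic pairs are mixed with the limited authentic parallel data
# to enlarge the NMT training corpus.
authentic = [("ich bin hier", "i am here")]
synthetic = build_synthetic_corpus(["we will see", "it works well"])
training_corpus = authentic + synthetic
```

The AR paper's contribution then sits on top of such a pipeline: repairing noisy synthetic pairs before they enter training.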

Tasks: de-en, Machine Translation, +3
