no code implementations • CCL 2021 • Li Feiyu, Zhao Yahui, Yang Feiyang, Cui Rongyi
Experimental results show that our method outperforms the baselines on the Chinese-Korean and Korean-Chinese translation tasks, which fully demonstrates the effectiveness of our method.
no code implementations • CCL 2020 • Yang Feiyang, Zhao Yahui, Cui Rongyi
The results show that the model can identify the important words in Korean for representation learning without manual annotation.
no code implementations • CCL 2022 • Jiang Kexin, Zhao Yahui, Cui Rongyi
In the interaction layer, we first fuse the information of the sentence pair to obtain low-level semantic information; at the same time, we use the bi-directional attention from machine reading comprehension models together with self-attention to obtain high-level semantic information.
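A minimal numpy sketch of what such an interaction layer might look like: bi-directional attention lets each token of one sentence attend over the other (as in machine reading comprehension models), and self-attention is then applied to the fused sequence. The function names, the simple dot-product similarity, and the concatenation-based fusion are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_attention(A, B):
    """BiDAF-style attention between two token sequences (illustrative)."""
    S = A @ B.T                        # (m, n) similarity matrix
    a2b = softmax(S, axis=1) @ B       # each A token attends over B -> (m, d)
    b2a = softmax(S, axis=0).T @ A     # each B token attends over A -> (n, d)
    return a2b, b2a

def self_attention(X):
    """Scaled dot-product self-attention over one sequence (illustrative)."""
    S = X @ X.T / np.sqrt(X.shape[1])
    return softmax(S, axis=1) @ X

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 8))        # sentence A: 5 tokens, dim 8
B = rng.standard_normal((7, 8))        # sentence B: 7 tokens, dim 8
a2b, b2a = bidirectional_attention(A, B)
fused = np.concatenate([a2b, b2a], axis=0)   # low-level fusion -> (12, 8)
high = self_attention(fused)                 # high-level semantics -> (12, 8)
```

In a real model the similarity function, fusion, and attention would be learned layers; this sketch only shows the information flow the abstract describes.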