Search Results for author: Max Meng

Found 4 papers, 0 papers with code

On the Relationship between Neural Machine Translation and Word Alignment

no code implementations Xintong Li, Lemao Liu, Guanlin Li, Max Meng, Shuming Shi

We find that although NMT models struggle to capture word alignment for CFT words, these words do not significantly sacrifice translation quality, which explains why NMT is more successful at translation yet worse at word alignment compared to statistical machine translation.

Machine Translation NMT +2

Regularized Context Gates on Transformer for Machine Translation

no code implementations ACL 2020 Xintong Li, Lemao Liu, Rui Wang, Guoping Huang, Max Meng

This paper first provides a method to identify source and target contexts and then introduces a gate mechanism to control the source and target contributions in the Transformer (see the sketch below).

Machine Translation NMT +1
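For illustration, here is a minimal sketch of a context gate of the kind this paper builds on: a sigmoid gate that interpolates between the source context (cross-attention output) and the target context (decoder state). All names are hypothetical and this is not the authors' code; the paper's contribution additionally regularizes the gate, which this sketch omits.

```python
import torch
import torch.nn as nn

class ContextGate(nn.Module):
    """Sigmoid gate interpolating source and target contexts (illustrative)."""

    def __init__(self, d_model: int):
        super().__init__()
        # Gate computed from the concatenated source and target contexts.
        self.proj = nn.Linear(2 * d_model, d_model)

    def forward(self, source_ctx: torch.Tensor, target_ctx: torch.Tensor) -> torch.Tensor:
        # z in (0, 1) gives the per-dimension source contribution;
        # (1 - z) gives the target contribution.
        z = torch.sigmoid(self.proj(torch.cat([source_ctx, target_ctx], dim=-1)))
        return z * source_ctx + (1.0 - z) * target_ctx
```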

On the Word Alignment from Neural Machine Translation

no code implementations ACL 2019 Xintong Li, Guanlin Li, Lemao Liu, Max Meng, Shuming Shi

Prior research suggests that neural machine translation (NMT) captures word alignment through its attention mechanism; however, this paper finds that attention may almost fail to capture word alignment for some NMT models (see the sketch below).

Machine Translation NMT +2
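As context for the alignment claim, a common proxy (an assumption here, not necessarily the paper's exact procedure) derives a hard alignment by linking each target word to the source word receiving its highest attention weight:

```python
import torch

def alignment_from_attention(attn: torch.Tensor) -> torch.Tensor:
    """attn: (target_len, source_len) cross-attention weights for one
    sentence pair, e.g. averaged over heads. Returns, for each target
    word, the index of the source word it attends to most."""
    return attn.argmax(dim=-1)

# Example: the first target word attends mostly to source position 2.
attn = torch.tensor([[0.1, 0.2, 0.7],
                     [0.6, 0.3, 0.1]])
print(alignment_from_attention(attn))  # tensor([2, 0])
```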

Target Foresight Based Attention for Neural Machine Translation

no code implementations NAACL 2018 Xintong Li, Lemao Liu, Zhaopeng Tu, Shuming Shi, Max Meng

In neural machine translation, an attention model is used to identify the aligned source words for a target word (the target foresight word) in order to select translation context, but it makes no use of any information about this target foresight word (see the sketch below).

Language Modelling Machine Translation +1
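One way to read "target foresight" is to let the attention scores condition on a representation of the target word being generated, not only the decoder state. The sketch below is a hypothetical additive-attention variant along those lines, not the authors' exact model:

```python
import torch
import torch.nn as nn

class ForesightAttention(nn.Module):
    """Additive attention whose scores also see a foresight embedding
    (illustrative; parameter names are assumptions)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.w_dec = nn.Linear(d_model, d_model, bias=False)   # decoder state
        self.w_src = nn.Linear(d_model, d_model, bias=False)   # source states
        self.w_fore = nn.Linear(d_model, d_model, bias=False)  # foresight word
        self.v = nn.Linear(d_model, 1, bias=False)

    def forward(self, dec_state, src_states, foresight_emb):
        # dec_state: (d,), src_states: (src_len, d), foresight_emb: (d,)
        scores = self.v(torch.tanh(
            self.w_dec(dec_state) + self.w_src(src_states) + self.w_fore(foresight_emb)
        )).squeeze(-1)                        # (src_len,)
        weights = torch.softmax(scores, dim=-1)
        return weights @ src_states           # context vector, shape (d,)
```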
