Multi-Domain Neural Machine Translation with Word-Level Domain Context Discrimination

EMNLP 2018. Jiali Zeng, Jinsong Su, Huating Wen, Yang Liu, Jun Xie, Yongjing Yin, Jianqiang Zhao

Owing to its great practical value, research on multi-domain Neural Machine Translation (NMT) mainly focuses on using mixed-domain parallel sentences to construct a unified model that can switch between domains at translation time. Intuitively, the words in a sentence are related to its domain to varying degrees, and thus exert disparate impacts on multi-domain NMT modeling...
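The intuition above, that each word's domain relevance should modulate how its representation is used, can be sketched as a simple word-level gating mechanism. This is a hypothetical illustration of the general idea, not the paper's actual architecture; the function names, the scalar `logit` standing in for a word-level domain classifier score, and the shared/domain-specific split are all assumptions.

```python
import math

def sigmoid(x):
    """Squash a real-valued score into a gate in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def mix(shared, specific, logit):
    """Blend shared and domain-specific word vectors by a learned gate.

    `logit` stands in for a word-level domain discriminator score:
    a large value means the word is strongly domain-indicative, so its
    representation leans toward the domain-specific space (hypothetical
    parameterization for illustration only).
    """
    g = sigmoid(logit)
    return [g * d + (1.0 - g) * s for s, d in zip(shared, specific)]

# A domain-neutral word (logit near 0) stays halfway between spaces;
# a domain-marked word (large logit) moves toward the specific space.
neutral = mix([1.0, 0.0], [0.0, 1.0], 0.0)
marked = mix([1.0, 0.0], [0.0, 1.0], 4.0)
```

Under this sketch, `neutral` is an even blend of the two spaces, while `marked` is dominated by the domain-specific vector, mirroring the claim that words contribute to domain modeling to varying degrees.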

