no code implementations • EMNLP 2021 • Xinglin Lyu, Junhui Li, ZhengXian Gong, Min Zhang
In this paper, we apply the principle of "one translation per discourse" to NMT, aiming to encourage lexical translation consistency in document-level NMT.
no code implementations • 23 Feb 2024 • Xinglin Lyu, Junhui Li, Yanqing Zhao, Daimeng Wei, Shimin Tao, Hao Yang, Min Zhang
In this paper, we propose an alternative adaptation approach, named Decoding-enhanced Multi-phase Prompt Tuning (DeMPT), which enables LLMs to discriminately model and utilize inter- and intra-sentence context, adapting them more effectively to context-aware NMT.