Improving Zero-shot Translation with Language-Independent Constraints

WS 2019 · Ngoc-Quan Pham, Jan Niehues, Thanh-Le Ha, Alex Waibel

An important concern in training multilingual neural machine translation (NMT) is translating between language pairs unseen during training, i.e., zero-shot translation. Improving this ability kills two birds with one stone: it provides an alternative to pivot translation and also helps us better understand how the model captures information shared between languages...
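The contrast the abstract draws can be made concrete: pivot translation routes an unseen pair through a shared language (typically English) using two supervised model directions, while zero-shot translation decodes the unseen direction in a single pass. The sketch below illustrates this with a toy lookup table standing in for a trained multilingual model; all language pairs, sentences, and function names are hypothetical illustrations, not the paper's implementation.

```python
# Toy illustration of pivot vs. zero-shot translation for an unseen pair.
# A hypothetical lookup table stands in for a trained multilingual NMT model.

# Suppose the model was trained on de<->en and en<->fr, but never on de->fr.
TRAINED_DIRECTIONS = {("de", "en"), ("en", "de"), ("en", "fr"), ("fr", "en")}

# Placeholder "model": maps (source lang, target lang, sentence) -> translation.
TOY_MODEL = {
    ("de", "en", "hallo welt"): "hello world",
    ("en", "fr", "hello world"): "bonjour le monde",
    # A multilingual model can also decode the unseen direction directly:
    ("de", "fr", "hallo welt"): "bonjour le monde",
}

def translate(src: str, tgt: str, text: str) -> str:
    """One decoding pass of the (toy) multilingual model."""
    return TOY_MODEL[(src, tgt, text)]

def pivot_translate(src: str, tgt: str, text: str, pivot: str = "en") -> str:
    """Two supervised hops: src -> pivot, then pivot -> tgt."""
    return translate(pivot, tgt, translate(src, pivot, text))

def zero_shot_translate(src: str, tgt: str, text: str) -> str:
    """One direct pass through a direction unseen during training."""
    assert (src, tgt) not in TRAINED_DIRECTIONS
    return translate(src, tgt, text)

print(pivot_translate("de", "fr", "hallo welt"))      # two decoding passes
print(zero_shot_translate("de", "fr", "hallo welt"))  # one decoding pass
```

Zero-shot decoding halves the inference cost and avoids compounding errors across two passes, which is why improving it is an attractive alternative to pivoting.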



