JASS: Japanese-specific Sequence to Sequence Pre-training for Neural Machine Translation

LREC 2020 · Zhuoyuan Mao, Fabien Cromieres, Raj Dabre, Haiyue Song, Sadao Kurohashi

Neural machine translation (NMT) needs large parallel corpora for state-of-the-art translation quality. Low-resource NMT is typically addressed by transfer learning which leverages large monolingual or parallel corpora for pre-training...

