no code implementations • IWSLT 2017 • Raj Dabre, Fabien Cromieres, Sadao Kurohashi
We describe here our Machine Translation (MT) model and the results we obtained for the IWSLT 2017 Multilingual Shared Task.
1 code implementation • LREC 2020 • Zhuoyuan Mao, Fabien Cromieres, Raj Dabre, Haiyue Song, Sadao Kurohashi
Monolingual pre-training approaches such as MASS (MAsked Sequence to Sequence) are extremely effective in boosting NMT quality for languages with small parallel corpora.
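The core idea behind MASS-style pre-training is to mask a contiguous span of a monolingual sentence on the encoder side and train the decoder to reconstruct that span. The following is a minimal sketch of that masking step, assuming whitespace-tokenised input; the function name, the "[MASK]" token, and the 50% span ratio are illustrative choices, not details taken from the paper or the original MASS implementation.

```python
import random

def mass_mask(tokens, mask_token="[MASK]", ratio=0.5):
    """Mask a contiguous span (~ratio of the sentence) for MASS-style pre-training.

    Returns (encoder_input, decoder_target): the encoder sees the sentence with
    the span replaced by mask tokens; the decoder must reconstruct the span.
    """
    span_len = max(1, int(len(tokens) * ratio))
    start = random.randint(0, len(tokens) - span_len)
    encoder_input = tokens[:start] + [mask_token] * span_len + tokens[start + span_len:]
    decoder_target = tokens[start:start + span_len]
    return encoder_input, decoder_target

# e.g. mass_mask("the cat sat on the mat".split())
# -> (['the', '[MASK]', '[MASK]', '[MASK]', 'the', 'mat'], ['cat', 'sat', 'on'])
```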
no code implementations • WS 2019 • Fabien Cromieres, Sadao Kurohashi
We describe here our experiments for the news translation shared task of WMT 2019.
1 code implementation • WS 2017 • Fabien Cromieres, Raj Dabre, Toshiaki Nakazawa, Sadao Kurohashi
We describe here our approaches and results on the WAT 2017 shared translation tasks.
no code implementations • IJCNLP 2017 • Fabien Cromieres, Toshiaki Nakazawa, Raj Dabre
Machine Translation (MT) is a sub-field of NLP that has undergone a number of paradigm shifts since its inception.
no code implementations • MTSummit 2017 • Raj Dabre, Fabien Cromieres, Sadao Kurohashi
In this paper, we explore a simple solution to "Multi-Source Neural Machine Translation" (MSNMT) that relies only on preprocessing an N-way multilingual corpus, without modifying the Neural Machine Translation (NMT) architecture or training procedure.
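One way to realise such preprocessing-only multi-source NMT is to concatenate the aligned source sentences of each training example into a single source line, so that an unmodified single-source NMT system can be trained on it. The sketch below illustrates that idea under those assumptions; the function name, file names, and separator are placeholders rather than details confirmed by the paper.

```python
def concat_sources(source_files, output_file, separator=" "):
    """Join the i-th line of each source-language file into one source line,
    producing a single "multi-source" file for a standard NMT toolkit."""
    streams = [open(path, encoding="utf-8") for path in source_files]
    with open(output_file, "w", encoding="utf-8") as out:
        for lines in zip(*streams):
            out.write(separator.join(line.strip() for line in lines) + "\n")
    for stream in streams:
        stream.close()

# e.g. concat_sources(["train.fr", "train.de"], "train.multi-src")
```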
1 code implementation • WS 2016 • Fabien Cromieres, Chenhui Chu, Toshiaki Nakazawa, Sadao Kurohashi
We report very good translation results, especially when using neural MT for Chinese-to-Japanese translation.