no code implementations • IWSLT (EMNLP) 2018 • Dakun Zhang, Josep Crego, Jean Senellart
Knowledge distillation has recently been successfully applied to neural machine translation.
no code implementations • WS 2020 • Guillaume Klein, Dakun Zhang, Clément Chouteau, Josep Crego, Jean Senellart
This paper describes the OpenNMT submissions to the WNGT 2020 efficiency shared task.
no code implementations • WS 2018 • Jean Senellart, Dakun Zhang, Bo Wang, Guillaume Klein, Jean-Pierre Ramatchandirin, Josep Crego, Alexander Rush
We present a system description of the OpenNMT Neural Machine Translation entry for the WNMT 2018 evaluation.
no code implementations • WS 2017 • Yongchao Deng, Jungi Kim, Guillaume Klein, Catherine Kobus, Natalia Segal, Christophe Servan, Bo Wang, Dakun Zhang, Josep Crego, Jean Senellart
This paper describes SYSTRAN's systems submitted to the WMT 2017 shared news translation task for English-German, in both translation directions.
no code implementations • IJCNLP 2017 • Dakun Zhang, Jungi Kim, Josep Crego, Jean Senellart
Training efficiency is one of the main problems for Neural Machine Translation (NMT).
no code implementations • 18 Oct 2016 • Josep Crego, Jungi Kim, Guillaume Klein, Anabel Rebollo, Kathy Yang, Jean Senellart, Egor Akhanov, Patrice Brunelle, Aurelien Coquard, Yongchao Deng, Satoshi Enoue, Chiyo Geiss, Joshua Johanson, Ardas Khalsa, Raoum Khiari, Byeongil Ko, Catherine Kobus, Jean Lorieux, Leidiana Martins, Dang-Chuan Nguyen, Alexandra Priori, Thomas Riccardi, Natalia Segal, Christophe Servan, Cyril Tiquet, Bo Wang, Jin Yang, Dakun Zhang, Jing Zhou, Peter Zoldan
Since the first online demonstration of Neural Machine Translation (NMT) by LISA, NMT development has moved from the laboratory to production, as demonstrated by several entities announcing the roll-out of NMT engines to replace their existing technologies.