1 code implementation • EAMT 2020 • Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar
We present Sockeye 2, a modernized and streamlined version of the Sockeye neural machine translation (NMT) toolkit.
2 code implementations • 12 Jul 2022 • Felix Hieber, Michael Denkowski, Tobias Domhan, Barbara Darques Barros, Celina Dong Ye, Xing Niu, Cuong Hoang, Ke Tran, Benjamin Hsu, Maria Nadejde, Surafel Lakew, Prashant Mathur, Anna Currey, Marcello Federico
When running comparable models, Sockeye 3 is up to 126% faster than other PyTorch implementations on GPUs and up to 292% faster on CPUs.
1 code implementation • AMTA 2020 • Tobias Domhan, Michael Denkowski, David Vilar, Xing Niu, Felix Hieber, Kenneth Heafield
We present Sockeye 2, a modernized and streamlined version of the Sockeye neural machine translation (NMT) toolkit.
no code implementations • WS 2018 • Xing Niu, Michael Denkowski, Marine Carpuat
Despite impressive progress in high-resource settings, Neural Machine Translation (NMT) still struggles in low-resource and out-of-domain scenarios, often failing to match the quality of phrase-based translation.
16 code implementations • 15 Dec 2017 • Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton, Matt Post
Written in Python and built on MXNet, the toolkit offers scalable training and inference for the three most prominent encoder-decoder architectures: attentional recurrent neural networks, self-attentional transformers, and fully convolutional networks.
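For context on how such a toolkit is typically driven, here is a minimal sketch (not taken from the paper) that invokes Sockeye's command-line entry points from Python. The data file paths and model directory are hypothetical placeholders, and the flag names follow the toolkit's documented CLI but should be checked against the installed version.

```python
# Minimal sketch: driving Sockeye's CLI entry points via subprocess.
# Paths (train.de, model_dir, ...) are hypothetical placeholders.
import subprocess

# Train a model on pre-tokenized parallel data.
subprocess.run([
    "sockeye-train",
    "--source", "train.de",
    "--target", "train.en",
    "--validation-source", "dev.de",
    "--validation-target", "dev.en",
    "--output", "model_dir",
], check=True)

# Translate held-out source sentences with the trained model.
subprocess.run([
    "sockeye-translate",
    "--models", "model_dir",
    "--input", "test.de",
    "--output", "test.en.hyp",
], check=True)
```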
1 code implementation • WS 2017 • Michael Denkowski, Graham Neubig
Because new techniques are rarely evaluated in the strong settings used for deployment, it is often difficult to determine whether improvements from research will carry over to systems deployed for real-world use.