NMT

420 papers with code • 1 benchmark • 1 dataset

Neural machine translation is an approach to machine translation that uses an artificial neural network to predict the likelihood of a sequence of words, typically modeling entire sentences in a single integrated model.
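The "likelihood of a sequence of words" is computed by the chain rule: the model scores each target token conditioned on the source sentence and the tokens generated so far, and the sentence probability is the product of these per-step probabilities. The sketch below illustrates this factorization with a hand-made toy table of conditional distributions (the tokens and probabilities are invented for illustration; a real NMT decoder would produce them with a neural network).

```python
import math

# Toy conditional distributions p(y_t | y_<t, x) that a trained decoder
# would produce; the tokens and numbers here are made up for illustration.
def step_probs(prefix):
    table = {
        (): {'das': 0.7, 'ein': 0.3},
        ('das',): {'haus': 0.6, 'auto': 0.4},
        ('das', 'haus'): {'</s>': 0.9, 'ist': 0.1},
    }
    return table[tuple(prefix)]

def sentence_log_prob(target):
    # Chain rule: log p(y | x) = sum over t of log p(y_t | y_<t, x)
    logp, prefix = 0.0, []
    for tok in target:
        logp += math.log(step_probs(prefix)[tok])
        prefix.append(tok)
    return logp

print(sentence_log_prob(['das', 'haus', '</s>']))  # log(0.7 * 0.6 * 0.9)
```

Decoding searches for the target sequence that maximizes this score, typically with beam search, since exact maximization over all sequences is intractable.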


Most implemented papers

Effective Approaches to Attention-based Neural Machine Translation

facebookresearch/fairseq EMNLP 2015

Our ensemble model using different attention architectures has established a new state-of-the-art result in the WMT'15 English to German translation task with 25.9 BLEU points, an improvement of 1.0 BLEU points over the existing best system backed by NMT and an n-gram reranker.

Neural Machine Translation of Rare Words with Subword Units

rsennrich/subword-nmt ACL 2016

Neural machine translation (NMT) models typically operate with a fixed vocabulary, but translation is an open-vocabulary problem.
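The paper's answer to the open-vocabulary problem is byte-pair encoding (BPE): start from a character-level vocabulary and repeatedly merge the most frequent adjacent symbol pair, so common words end up as single tokens while rare words decompose into subwords. A minimal sketch of the merge-learning loop (the toy word frequencies are illustrative; `</w>` marks end of word):

```python
import collections
import re

def get_stats(vocab):
    # Count frequencies of adjacent symbol pairs across the corpus vocabulary.
    pairs = collections.Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_vocab(pair, vocab):
    # Replace every occurrence of the chosen symbol pair with its merged form.
    bigram = re.escape(' '.join(pair))
    pattern = re.compile(r'(?<!\S)' + bigram + r'(?!\S)')
    return {pattern.sub(''.join(pair), w): f for w, f in vocab.items()}

# Toy corpus: words pre-split into characters, with end-of-word marker </w>.
vocab = {'l o w </w>': 5, 'l o w e r </w>': 2,
         'n e w e s t </w>': 6, 'w i d e s t </w>': 3}
for _ in range(10):          # learn 10 merge operations
    pairs = get_stats(vocab)
    if not pairs:
        break
    best = max(pairs, key=pairs.get)
    vocab = merge_vocab(best, vocab)
```

After a few merges, frequent words such as "newest" collapse into single symbols, while rarer words remain split into reusable subword units. At translation time, an unseen word is segmented with the same learned merges, so the model never needs an out-of-vocabulary token.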

Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation

google/sentencepiece 26 Sep 2016

To improve parallelism and therefore decrease training time, our attention mechanism connects the bottom layer of the decoder to the top layer of the encoder.

Sockeye: A Toolkit for Neural Machine Translation

awslabs/sockeye 15 Dec 2017

Written in Python and built on MXNet, the toolkit offers scalable training and inference for the three most prominent encoder-decoder architectures: attentional recurrent neural networks, self-attentional transformers, and fully convolutional networks.

Phrase-Based & Neural Unsupervised Machine Translation

facebookresearch/UnsupervisedMT EMNLP 2018

Machine translation systems achieve near human-level performance on some languages, yet their effectiveness strongly relies on the availability of large amounts of parallel sentences, which hinders their applicability to the majority of language pairs.

Massive Exploration of Neural Machine Translation Architectures

google/seq2seq EMNLP 2017

Neural Machine Translation (NMT) has shown remarkable progress over the past few years with production systems now being deployed to end-users.

OpenNMT: Neural Machine Translation Toolkit

rsennrich/nematus WS 2018

OpenNMT is an open-source toolkit for neural machine translation (NMT).

Joey NMT: A Minimalist NMT Toolkit for Novices

joeynmt/joeynmt IJCNLP 2019

We present Joey NMT, a minimalist neural machine translation toolkit based on PyTorch that is specifically designed for novices.

Sequence-Level Knowledge Distillation

harvardnlp/seq2seq-attn EMNLP 2016

We demonstrate that standard knowledge distillation applied to word-level prediction can be effective for NMT, and also introduce two novel sequence-level versions of knowledge distillation that further improve performance, and somewhat surprisingly, seem to eliminate the need for beam search (even when applied on the original teacher model).
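The two flavors of distillation differ in what the student is trained to match. In word-level KD the student's per-token distribution is trained toward the teacher's full distribution; in sequence-level KD the teacher's beam-search output is treated as a hard reference translation and the student is trained on it with ordinary cross-entropy. A minimal sketch of both losses, using plain Python dictionaries as token distributions (not the paper's implementation, which operates on neural network logits):

```python
import math

def word_level_kd_loss(teacher_dists, student_dists):
    # Cross-entropy of the student against the teacher's full per-token
    # distribution, summed over time steps (equivalent to KL divergence
    # up to the teacher's entropy, a constant w.r.t. the student).
    loss = 0.0
    for p_t, q_t in zip(teacher_dists, student_dists):
        loss -= sum(p * math.log(q_t[tok]) for tok, p in p_t.items())
    return loss

def sequence_level_kd_loss(teacher_output, student_dists):
    # Sequence-level KD: the teacher's beam-search output is used as a
    # hard target, and the student pays ordinary cross-entropy on it.
    return -sum(math.log(q_t[tok])
                for tok, q_t in zip(teacher_output, student_dists))

teacher = [{'a': 0.9, 'b': 0.1}]
student = [{'a': 0.5, 'b': 0.5}]
print(word_level_kd_loss(teacher, student))        # -0.9*ln(0.5) - 0.1*ln(0.5)
print(sequence_level_kd_loss(['a'], student))      # -ln(0.5)
```

Because a sequence-level student is trained to reproduce the teacher's beam output directly, greedy decoding from the student already approximates that output, which is the intuition behind the paper's observation that beam search becomes largely unnecessary.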

code2seq: Generating Sequences from Structured Representations of Code

tech-srl/code2seq ICLR 2019

The ability to generate natural language sequences from source code snippets has a variety of applications such as code summarization, documentation, and retrieval.