Machine Translation

1159 papers with code • 55 benchmarks • 52 datasets

Machine translation is the task of automatically translating text from a source language into a different target language.

(Image credit: Google seq2seq)

Greatest papers with code

Memory Efficient Adaptive Optimization

google-research/google-research NeurIPS 2019

Adaptive gradient-based optimizers such as Adagrad and Adam are crucial for achieving state-of-the-art performance in machine translation and language modeling.

Language Modelling • Machine Translation
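
The proposed optimizer (SM3) replaces Adam's per-parameter second-moment statistics with accumulators shared across covers of each tensor, e.g. the rows and columns of a weight matrix. Below is a minimal numpy sketch of that cover-set update for a single 2-D parameter; the function name and hyperparameters are illustrative, not the released implementation.

    import numpy as np

    def sm3_step(w, g, row_acc, col_acc, lr=0.1, eps=1e-8):
        # Illustrative sketch: rebuild a full second-moment estimate from
        # row/column accumulators, so memory is m + n instead of m * n.
        v = np.minimum(row_acc[:, None], col_acc[None, :]) + g ** 2
        row_acc[:] = v.max(axis=1)  # tighten the row accumulators
        col_acc[:] = v.max(axis=0)  # tighten the column accumulators
        w -= lr * g / (np.sqrt(v) + eps)
        return w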

Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling

pytorch/fairseq 13 Nov 2020

Pre-training models on vast quantities of unlabeled data has emerged as an effective approach to improving accuracy on many NLP tasks.

Ranked #1 on Machine Translation on WMT2016 Romanian-English (using extra training data)

Machine Translation
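
Noisy channel decoding applies Bayes' rule, p(y|x) ∝ p(x|y) p(y), so a candidate translation y can be scored with a reverse (channel) model and a language model alongside the direct model. A minimal sketch of such a combined score; the interpolation scheme is illustrative rather than the paper's exact parameterization.

    def noisy_channel_score(log_p_fwd, log_p_rev, log_p_lm, lam=1.0):
        # Direct model log p(y|x) plus channel model log p(x|y) and
        # language model log p(y), weighted by a constant tuned on
        # held-out data (illustrative, not the paper's tuned value).
        return log_p_fwd + lam * (log_p_rev + log_p_lm)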

Deep Transformers with Latent Depth

pytorch/fairseq NeurIPS 2020

As an extension of this framework, we propose a novel method to train one shared Transformer network for multilingual machine translation with different layer selection posteriors for each language pair.

Language Modelling • Machine Translation
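
The per-pair layer selection can be viewed as a learned gate over each layer of the shared stack. The PyTorch sketch below shows one way to realize such a gate with a logistic relaxation; the parameterization is an illustration of the idea, not the paper's exact formulation.

    import torch

    def latent_depth_forward(x, layers, layer_logits, tau=1.0):
        # layer_logits: learnable per-layer selection logits (one set per
        # language pair, hypothetically). Each layer is applied through a
        # sampled soft gate; a closed gate falls back to the residual path.
        u = torch.rand_like(layer_logits).clamp(1e-6, 1 - 1e-6)
        noise = torch.log(u) - torch.log1p(-u)      # logistic noise
        gates = torch.sigmoid((layer_logits + noise) / tau)
        for layer, g in zip(layers, gates):
            x = g * layer(x) + (1.0 - g) * x        # soft layer skip
        return x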

Cross-lingual Retrieval for Iterative Self-Supervised Training

pytorch/fairseq NeurIPS 2020

Recent studies have demonstrated the cross-lingual alignment ability of multilingual pretrained language models.

Unsupervised Machine Translation
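
The method (CRISS) mines pseudo-parallel sentence pairs with the model's own multilingual encoder and retrains on them iteratively. A simplified numpy sketch of the retrieval step using plain cosine similarity (the paper itself uses a margin-based criterion); the names and threshold are illustrative.

    import numpy as np

    def mine_parallel_pairs(src_emb, tgt_emb, threshold=0.8):
        # Pair each source sentence with its nearest target sentence by
        # cosine similarity of L2-normalized sentence embeddings.
        src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
        tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
        sim = src @ tgt.T
        best = sim.argmax(axis=1)
        return [(i, int(j)) for i, j in enumerate(best)
                if sim[i, j] >= threshold]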

Neural Machine Translation with Byte-Level Subwords

pytorch/fairseq 7 Sep 2019

Representing text at the level of bytes and using the 256-byte set as the vocabulary is a potential solution to rare characters taking up vocabulary slots.

Machine Translation
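
With bytes as the base units, every script maps to at most 256 symbols, and byte-level BPE then merges frequent byte sequences into longer tokens. A minimal Python illustration of the byte encoding itself (the BPE merge step is omitted):

    text = "Grüße, 世界"                        # any Unicode string
    byte_ids = list(text.encode("utf-8"))       # ids from a fixed 256-symbol vocabulary
    assert all(0 <= b < 256 for b in byte_ids)
    restored = bytes(byte_ids).decode("utf-8")  # lossless round trip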

Unsupervised Quality Estimation for Neural Machine Translation

pytorch/fairseq 21 May 2020

Quality Estimation (QE) is an important component in making Machine Translation (MT) useful in real-world applications, as it aims to inform the user about the quality of the MT output at test time.

Machine Translation
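
The paper derives quality indicators from the NMT system itself, for example from its output probabilities and from uncertainty under Monte Carlo dropout, with no supervised QE data. A minimal sketch of the simplest such indicator, the length-normalized sequence probability; the function name is illustrative.

    import math

    def sequence_confidence(token_logprobs):
        # Length-normalized probability of the MT output under the NMT
        # model itself, used as an unsupervised quality proxy.
        return math.exp(sum(token_logprobs) / len(token_logprobs))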

Monotonic Multihead Attention

pytorch/fairseq ICLR 2020

Simultaneous machine translation models start generating a target sequence before they have encoded or read the source sequence.

Machine Translation
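
In monotonic attention, each head keeps a read pointer that only moves forward over the source, deciding at every position whether to keep reading or to stop and let the decoder emit a token. A toy sketch of that test-time read policy for a single head; training uses an expected (soft) version of this computation, and the details here are illustrative.

    def monotonic_read(p_select, start=0):
        # p_select: per-position stopping probabilities for one head.
        # Scan the source left to right and stop at the first position
        # whose probability crosses 0.5; the decoder then writes a token.
        j = start
        while j < len(p_select) - 1 and p_select[j] < 0.5:
            j += 1
        return j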

Reducing Transformer Depth on Demand with Structured Dropout

pytorch/fairseq ICLR 2020

Overparameterized transformer networks have obtained state-of-the-art results in various natural language processing tasks, such as machine translation, language modeling, and question answering.

Language Modelling • Machine Translation +1
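
LayerDrop regularizes the network by skipping whole layers at random during training, which also makes it possible to prune layers at inference time without fine-tuning. A minimal PyTorch sketch, assuming a list of residual layers; the drop rate and names are illustrative.

    import torch

    def layerdrop_forward(x, layers, p=0.2, training=True):
        # Structured dropout over entire layers: each residual layer is
        # skipped with probability p during training.
        for layer in layers:
            if training and torch.rand(()).item() < p:
                continue            # drop the whole layer
            x = layer(x)
        return x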

Jointly Learning to Align and Translate with Transformer Models

pytorch/fairseq IJCNLP 2019

The state of the art in machine translation (MT) is governed by neural approaches, which typically provide superior translation accuracy over statistical approaches.

Machine Translation • Word Alignment
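
Once an attention head has been trained to behave like an aligner, word alignments can be read directly off its attention matrix. A minimal sketch of that extraction step by per-target-word argmax; the paper's full recipe additionally supervises one head with guided alignments.

    import numpy as np

    def extract_alignment(attn):
        # attn: (target_len, source_len) attention weights from the
        # alignment head; link each target word to its highest-scoring
        # source word.
        return [(t, int(s)) for t, s in enumerate(attn.argmax(axis=1))]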

Simple and Effective Noisy Channel Modeling for Neural Machine Translation

pytorch/fairseq IJCNLP 2019

Previous work on neural noisy channel modeling relied on latent variable models that incrementally process the source and target sentence.

Latent Variable Models • Machine Translation
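
A common way to apply such a channel model without latent variables is to rescore an n-best list produced by the direct model. A self-contained sketch of that reranking step, assuming precomputed log-probabilities; the interpolation weight and length normalization follow the general noisy-channel recipe rather than the paper's tuned values.

    def rerank_nbest(nbest, lam=0.5):
        # nbest: list of (log_p_fwd, log_p_rev, log_p_lm, target_len)
        # tuples per hypothesis; combine the scores after length
        # normalization and return the best hypothesis.
        def score(hyp):
            fwd, rev, lm, n = hyp
            return (fwd + lam * (rev + lm)) / n
        return max(nbest, key=score)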