Machine Translation

1155 papers with code • 55 benchmarks • 51 datasets

Machine translation is the task of translating a sentence from a source language into a different target language.

(Image credit: Google seq2seq)

Greatest papers with code

Semi-Supervised Sequence Modeling with Cross-View Training

tensorflow/models EMNLP 2018

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG Supertagging • Dependency Parsing • +5
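
The abstract above describes the core of CVT: a primary prediction module sees the full Bi-LSTM representation, while auxiliary modules see restricted views of the input and are trained on unlabeled data to match the primary module's predictions. Below is a minimal PyTorch sketch of that training signal; all module names, dimensions, and data are illustrative, not taken from the paper's released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative dimensions; nothing here comes from the paper's released code.
VOCAB, EMB, HID, CLASSES = 1000, 32, 64, 5

class CVTTagger(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.bilstm = nn.LSTM(EMB, HID, batch_first=True, bidirectional=True)
        self.primary = nn.Linear(2 * HID, CLASSES)  # sees the full Bi-LSTM states
        self.aux_fwd = nn.Linear(HID, CLASSES)      # restricted view: forward states only
        self.aux_bwd = nn.Linear(HID, CLASSES)      # restricted view: backward states only

    def forward(self, tokens):
        h, _ = self.bilstm(self.emb(tokens))        # (batch, time, 2*HID)
        return self.primary(h), self.aux_fwd(h[..., :HID]), self.aux_bwd(h[..., HID:])

model = CVTTagger()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
labeled_x = torch.randint(0, VOCAB, (8, 12))
labeled_y = torch.randint(0, CLASSES, (8, 12))
unlabeled_x = torch.randint(0, VOCAB, (8, 12))

# Supervised signal: cross-entropy on the primary module for labeled sentences.
primary, _, _ = model(labeled_x)
sup_loss = F.cross_entropy(primary.reshape(-1, CLASSES), labeled_y.reshape(-1))

# Cross-view signal: auxiliary modules match the (detached) primary prediction on unlabeled text.
primary_u, aux_f, aux_b = model(unlabeled_x)
target = F.softmax(primary_u, dim=-1).detach()
cvt_loss = sum(F.kl_div(F.log_softmax(a, dim=-1), target, reduction="batchmean")
               for a in (aux_f, aux_b))

opt.zero_grad()
(sup_loss + cvt_loss).backward()
opt.step()
```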

Can Active Memory Replace Attention?

tensorflow/models NeurIPS 2016

Several mechanisms to focus attention of a neural network on selected parts of its input or memory have been used successfully in deep learning models in recent years.

Image Captioning • Machine Translation

Exploiting Similarities among Languages for Machine Translation

tensorflow/models 17 Sep 2013

Dictionaries and phrase tables are the basis of modern statistical machine translation systems.

Machine Translation
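
This paper's central idea is to learn a linear transformation between monolingual word-embedding spaces from a small seed dictionary, and then translate new words by nearest-neighbour search in the target space. A rough NumPy sketch of that idea follows; the embeddings and seed pairs are random stand-ins, not real data, and the paper itself optimises the same objective with stochastic gradient descent rather than a closed-form solve.

```python
import numpy as np

rng = np.random.default_rng(0)
d_src, d_tgt, n_seed = 50, 50, 200

X = rng.normal(size=(n_seed, d_src))   # source-language vectors for seed dictionary entries
Z = rng.normal(size=(n_seed, d_tgt))   # corresponding target-language vectors

# Least-squares solution of  X W ~ Z : the "translation matrix".
W, *_ = np.linalg.lstsq(X, Z, rcond=None)

def translate(src_vec, tgt_matrix):
    """Map a source vector through W and return the index of the closest target word (cosine)."""
    mapped = src_vec @ W
    sims = tgt_matrix @ mapped / (np.linalg.norm(tgt_matrix, axis=1) * np.linalg.norm(mapped) + 1e-9)
    return int(np.argmax(sims))

print(translate(X[0], Z))  # with real embeddings this index would point at a candidate translation
```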

Attention Is All You Need

tensorflow/models NeurIPS 2017

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration.

Abstractive Text Summarization • Constituency Parsing • +1
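
The Transformer dispenses with recurrence and convolution entirely; its core operation is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A toy NumPy sketch of that single operation, with shapes chosen purely for illustration:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, the building block of the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)            # (..., n_queries, n_keys)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)            # softmax over the keys
    return weights @ V                                        # (..., n_queries, d_v)

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4, 8))   # (batch, query positions, d_k), toy sizes
K = rng.normal(size=(2, 6, 8))
V = rng.normal(size=(2, 6, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 4, 8)
```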

Beyond English-Centric Multilingual Machine Translation

huggingface/transformers 21 Oct 2020

Existing work in translation demonstrated the potential of massively multilingual machine translation by training a single model able to translate between any pair of languages.

Machine Translation
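
A short inference sketch with the 418M-parameter M2M-100 checkpoint distributed through huggingface/transformers, following the library's documented usage; the checkpoint choice and the French-to-German direction are only examples.

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")
model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")

tokenizer.src_lang = "fr"  # translate French -> German directly, without pivoting through English
encoded = tokenizer("La vie est comme une boîte de chocolat.", return_tensors="pt")
generated = model.generate(**encoded, forced_bos_token_id=tokenizer.get_lang_id("de"))
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```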

Multilingual Translation with Extensible Multilingual Pretraining and Finetuning

huggingface/transformers 2 Aug 2020

Recent work demonstrates the potential of multilingual pretraining to create one model that can be used for various tasks in different languages.

Machine Translation
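
Usage is analogous for the many-to-many mBART-50 checkpoint from this line of work, again following the huggingface/transformers documentation; the Hindi-to-French direction and the input sentence are only examples.

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50-many-to-many-mmt")
tokenizer = MBart50TokenizerFast.from_pretrained(
    "facebook/mbart-large-50-many-to-many-mmt", src_lang="hi_IN")

encoded = tokenizer("संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है",
                    return_tensors="pt")
# The forced BOS token tells the decoder which target language to generate.
generated = model.generate(**encoded, forced_bos_token_id=tokenizer.lang_code_to_id["fr_XX"])
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```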

Multilingual Denoising Pre-training for Neural Machine Translation

huggingface/transformers 22 Jan 2020

This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks.

Denoising • Document-level • +1

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

huggingface/transformers ACL 2020

We evaluate a number of noising approaches, finding the best performance by both randomly shuffling the order of the original sentences and using a novel in-filling scheme, where spans of text are replaced with a single mask token.

Abstractive Text Summarization • Denoising • +4
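
A toy sketch of the two noising operations the sentence above describes, sentence permutation and text infilling, in which a sampled span of tokens is replaced with a single mask token. Span lengths are Poisson-distributed in the paper; a fixed length is used here for brevity, and the helpers are illustrative rather than the authors' implementation.

```python
import random

def permute_sentences(sentences):
    """Randomly shuffle the order of the original sentences."""
    shuffled = sentences[:]
    random.shuffle(shuffled)
    return shuffled

def infill(tokens, span_len=3, mask="<mask>"):
    """Replace one span of `span_len` tokens with a single mask token."""
    start = random.randrange(max(1, len(tokens) - span_len))
    return tokens[:start] + [mask] + tokens[start + span_len:]

doc = ["the cat sat .", "it was warm .", "then it slept ."]
print(permute_sentences(doc))
print(infill("the quick brown fox jumps over the lazy dog".split()))
```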