About

Machine translation is the task of automatically translating a sentence from a source language into a different target language.

(Image credit: Google seq2seq)


Greatest papers with code

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 tensorflow/models

We propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG SUPERTAGGING · DEPENDENCY PARSING · MACHINE TRANSLATION · MULTI-TASK LEARNING · NAMED ENTITY RECOGNITION · PART-OF-SPEECH TAGGING · UNSUPERVISED REPRESENTATION LEARNING
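
The unlabeled half of the CVT objective can be sketched as follows: the primary prediction module, which sees the full Bi-LSTM states, acts as a teacher for auxiliary modules that each see only a restricted view of the input (e.g. forward-only states). This is a minimal sketch, not the released tensorflow/models code; encode, primary_head, and aux_views are hypothetical stand-ins for the encoder, the primary module, and the (module, view-restriction) pairs.

    import torch
    import torch.nn.functional as F

    def cvt_unsupervised_loss(encode, primary_head, aux_views, tokens):
        """One CVT step on an unlabeled batch (hypothetical helper names)."""
        full = encode(tokens)                      # full bidirectional view
        with torch.no_grad():                      # primary module is the teacher
            target = F.softmax(primary_head(full), dim=-1)
        loss = 0.0
        for head, restrict in aux_views:           # each auxiliary head sees a partial view
            logits = head(restrict(full))
            loss = loss + F.kl_div(F.log_softmax(logits, dim=-1),
                                   target, reduction="batchmean")
        return loss

On labeled batches the same encoder is trained with the ordinary supervised loss, so labeled and unlabeled data alternate during training.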

Can Active Memory Replace Attention?

NeurIPS 2016 tensorflow/models

Several mechanisms to focus attention of a neural network on selected parts of its input or memory have been used successfully in deep learning models in recent years.

IMAGE CAPTIONING · MACHINE TRANSLATION
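
The contrast the paper draws: attention reads one weighted summary of the memory per step, whereas active memory updates every memory cell in parallel at every step. Below is a minimal sketch of one such parallel update, a gated 1-D convolution in the spirit of the Neural GPU line of models the paper builds on; it is not the paper's exact Extended Neural GPU architecture.

    import torch
    import torch.nn as nn

    class ActiveMemoryStep(nn.Module):
        """One parallel update of the whole memory tensor (gated 1-D convolution)."""

        def __init__(self, channels: int, kernel_size: int = 3):
            super().__init__()
            pad = kernel_size // 2
            self.candidate = nn.Conv1d(channels, channels, kernel_size, padding=pad)
            self.gate = nn.Conv1d(channels, channels, kernel_size, padding=pad)

        def forward(self, memory):                # memory: (batch, channels, length)
            g = torch.sigmoid(self.gate(memory))  # per-cell update gate
            return g * torch.tanh(self.candidate(memory)) + (1.0 - g) * memory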

Exploiting Similarities among Languages for Machine Translation

17 Sep 2013 tensorflow/models

Dictionaries and phrase tables are the basis of modern statistical machine translation systems.

MACHINE TRANSLATION
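
The paper's key idea is that a linear map between monolingual embedding spaces, fit on a small seed dictionary, can translate unseen words by nearest-neighbor lookup. A minimal sketch with random stand-in embeddings; the paper optimizes the same objective with gradient descent, whereas this sketch uses the closed-form least-squares solution.

    import numpy as np

    # Toy stand-ins: rows are embeddings for word pairs from a seed dictionary.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 300))   # source-language vectors (e.g. English)
    Z = rng.normal(size=(5000, 300))   # target-language vectors (e.g. Spanish)

    # Fit the translation matrix W minimizing ||XW - Z||_F^2.
    W, *_ = np.linalg.lstsq(X, Z, rcond=None)

    def translate(x, target_matrix):
        """Map a source vector through W, return the nearest target row index."""
        mapped = x @ W
        sims = (target_matrix @ mapped) / (
            np.linalg.norm(target_matrix, axis=1) * np.linalg.norm(mapped) + 1e-9)
        return int(np.argmax(sims))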

Attention Is All You Need

NeurIPS 2017 tensorflow/models

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration.

ABSTRACTIVE TEXT SUMMARIZATION · CONSTITUENCY PARSING · MACHINE TRANSLATION
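
The Transformer's core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V; multi-head attention runs several of these in parallel over learned projections. A minimal NumPy sketch of the single-head case:

    import numpy as np

    def scaled_dot_product_attention(Q, K, V, mask=None):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
        d_k = Q.shape[-1]
        scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)
        if mask is not None:                           # e.g. causal mask in the decoder
            scores = np.where(mask, scores, -1e9)
        scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V

    # Example: 2 queries attending over 4 key/value positions of width 8.
    q = np.random.randn(2, 8); k = np.random.randn(4, 8); v = np.random.randn(4, 8)
    out = scaled_dot_product_attention(q, k, v)        # shape (2, 8)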

Pre-trained Summarization Distillation

24 Oct 2020 huggingface/transformers

A third, simpler approach is to 'shrink and fine-tune' (SFT), which avoids any explicit distillation by copying parameters to a smaller student model and then fine-tuning.

KNOWLEDGE DISTILLATION · MACHINE TRANSLATION
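
A sketch of shrink-and-fine-tune with the transformers library: build a student with fewer decoder layers, copy the shared weights, then copy a spaced subset of the teacher's decoder layers. The checkpoint name and the every-other-layer choice here are illustrative; which layers to keep is a hyperparameter the paper studies.

    from copy import deepcopy

    from transformers import BartForConditionalGeneration

    teacher = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")

    # Student with half the decoder layers of the teacher.
    student_config = deepcopy(teacher.config)
    student_config.decoder_layers = teacher.config.decoder_layers // 2
    student = BartForConditionalGeneration(student_config)

    # Embeddings, encoder, and head names match, so strict=False copies them
    # and simply skips the teacher-only decoder layers.
    student.load_state_dict(teacher.state_dict(), strict=False)

    # Copy every other teacher decoder layer (0, 2, 4, ...) into the student.
    picked = list(range(0, teacher.config.decoder_layers, 2))
    for s_idx, t_idx in enumerate(picked):
        student.model.decoder.layers[s_idx].load_state_dict(
            teacher.model.decoder.layers[t_idx].state_dict())

    # `student` is now ready for ordinary fine-tuning; no distillation loss needed.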

Multilingual Denoising Pre-training for Neural Machine Translation

22 Jan 2020 huggingface/transformers

This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks.

DENOISING · UNSUPERVISED MACHINE TRANSLATION
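
A usage sketch with the transformers library, assuming the facebook/mbart-large-en-ro checkpoint (the pre-trained mBART model fine-tuned for English-Romanian); mBART pins the target-language code as the first decoder token, which is why it is passed to generate below.

    from transformers import MBartForConditionalGeneration, MBartTokenizer

    tokenizer = MBartTokenizer.from_pretrained(
        "facebook/mbart-large-en-ro", src_lang="en_XX", tgt_lang="ro_RO")
    model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-en-ro")

    batch = tokenizer("UN Chief Says There Is No Military Solution in Syria",
                      return_tensors="pt")
    generated = model.generate(
        **batch, decoder_start_token_id=tokenizer.lang_code_to_id["ro_RO"])
    print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])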

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

ACL 2020 huggingface/transformers

We evaluate a number of noising approaches, finding the best performance by both randomly shuffling the order of the original sentences and using a novel in-filling scheme, where spans of text are replaced with a single mask token.

ABSTRACTIVE TEXT SUMMARIZATION · DENOISING · MACHINE TRANSLATION · NATURAL LANGUAGE INFERENCE · QUESTION ANSWERING · TEXT GENERATION
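
A minimal sketch of that noising function: shuffle the sentence order, then replace Poisson(λ=3)-length spans with a single mask token, as the abstract describes. The span-start probability and the overall mask budget below are illustrative knobs, not the paper's exact schedule.

    import numpy as np

    MASK = "<mask>"

    def bart_noise(sentences, lam=3.0, mask_ratio=0.3, seed=0):
        """Sentence permutation followed by span infilling with one <mask> per span."""
        rng = np.random.default_rng(seed)
        order = rng.permutation(len(sentences))    # shuffle sentence order
        tokens = " ".join(sentences[i] for i in order).split()
        budget = int(mask_ratio * len(tokens))     # how many tokens to hide in total
        out, i = [], 0
        while i < len(tokens):
            if budget > 0 and rng.random() < 0.15:         # start a masked span here
                span = int(min(rng.poisson(lam), budget))
                out.append(MASK)                   # one mask per span, even if length 0
                i += span
                budget -= max(span, 1)             # guarantee the loop makes progress
            else:
                out.append(tokens[i])
                i += 1
        return " ".join(out)

    print(bart_noise(["The cat sat on the mat.", "It purred.", "Then it slept."]))

The model is then trained to reconstruct the original, un-noised text from this corrupted input.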