Machine Translation

531 papers with code · Natural Language Processing

Machine translation is the task of translating a sentence from a source language into a different target language.

(Image credit: Google seq2seq)
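
As a concrete illustration of the task, here is a minimal sketch using the huggingface/transformers library featured further down this page; the t5-small checkpoint is an assumed example, and any translation checkpoint with the same pipeline interface would work.

```python
# Minimal machine translation example (a sketch, not a benchmark system).
# Assumes huggingface/transformers is installed; "t5-small" is just an
# illustrative checkpoint that supports English-to-German translation.
from transformers import pipeline

translator = pipeline("translation_en_to_de", model="t5-small")
result = translator("Machine translation maps a sentence into another language.")
print(result)  # [{'translation_text': '...'}]
```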


Greatest papers with code

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 tensorflow/models

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG SUPERTAGGING DEPENDENCY PARSING MACHINE TRANSLATION MULTI-TASK LEARNING NAMED ENTITY RECOGNITION UNSUPERVISED REPRESENTATION LEARNING
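
A toy sketch of the cross-view idea follows; the module names are illustrative placeholders rather than the paper's API, and the restricted input views are assumed to live inside each auxiliary module.

```python
# Semi-supervised step in the spirit of CVT (a sketch, not the paper's code):
# labeled batches train the primary predictor with supervised loss; on
# unlabeled batches, auxiliary predictors that internally see only a
# restricted view of the encoder states are trained to match the primary
# module's full-view predictions.
import torch
import torch.nn.functional as F

def cvt_step(encoder, primary, auxiliaries, labeled, unlabeled, optimizer):
    x, y = labeled
    loss = F.cross_entropy(primary(encoder(x)), y)      # supervised term

    with torch.no_grad():                               # full-view "teacher"
        teacher = F.softmax(primary(encoder(unlabeled)), dim=-1)
    for aux in auxiliaries:                             # restricted-view "students"
        student = F.log_softmax(aux(encoder(unlabeled)), dim=-1)
        loss = loss + F.kl_div(student, teacher, reduction="batchmean")

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```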

Attention Is All You Need

NeurIPS 2017 tensorflow/models

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration.

CONSTITUENCY PARSING MACHINE TRANSLATION
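
The core operation the paper replaces recurrence with is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)·V; a minimal NumPy sketch (single head, no masking, for clarity):

```python
# Scaled dot-product attention from "Attention Is All You Need".
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)        # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)      # row-wise softmax
    return weights @ V                                  # weighted sum of values

Q, K, V = (np.random.randn(4, 8), np.random.randn(6, 8), np.random.randn(6, 8))
print(scaled_dot_product_attention(Q, K, V).shape)      # (4, 8)
```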

Can Active Memory Replace Attention?

NeurIPS 2016 tensorflow/models

Several mechanisms to focus attention of a neural network on selected parts of its input or memory have been used successfully in deep learning models in recent years.

IMAGE CAPTIONING MACHINE TRANSLATION

Exploiting Similarities among Languages for Machine Translation

17 Sep 2013 · tensorflow/models

Dictionaries and phrase tables are the basis of modern statistical machine translation systems.

MACHINE TRANSLATION
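
The paper's method learns a linear map between monolingual embedding spaces from a small seed dictionary; a minimal sketch with toy data (real use would plug in word2vec-style vectors for actual dictionary pairs):

```python
# Learn a translation matrix W minimizing ||XW - Z||^2, where rows of X
# are source-language word vectors and rows of Z are the vectors of their
# dictionary translations; toy random data stands in for real embeddings.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 300))        # source seed-pair embeddings (toy)
Z = rng.standard_normal((5000, 300))        # aligned target embeddings (toy)

W, *_ = np.linalg.lstsq(X, Z, rcond=None)   # closed-form least-squares fit

def translate(x, target_vocab):
    z = x @ W                               # map into the target space
    sims = target_vocab @ z                 # dot-product similarity (cosine if normalized)
    return int(np.argmax(sims))             # index of the nearest target word
```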

Language Models are Unsupervised Multitask Learners

Preprint 2019 huggingface/pytorch-transformers

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets.

SOTA for Language Modelling on Text8 (using extra training data)

COMMON SENSE REASONING DOCUMENT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION QUESTION ANSWERING READING COMPREHENSION TEXT GENERATION

Phrase-Based & Neural Unsupervised Machine Translation

EMNLP 2018 huggingface/pytorch-transformers

Machine translation systems achieve near human-level performance on some languages, yet their effectiveness strongly relies on the availability of large amounts of parallel sentences, which hinders their applicability to the majority of language pairs.

UNSUPERVISED MACHINE TRANSLATION
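
Both the phrase-based and neural systems in the paper lean on iterative back-translation over monolingual data; a schematic sketch, with all function names as placeholders rather than the paper's code:

```python
# One round of iterative back-translation (schematic sketch). No parallel
# data is used: each direction is trained on synthetic pairs whose target
# side is real monolingual text.
def back_translation_round(mono_src, mono_tgt,
                           translate_s2t, translate_t2s, train_supervised):
    synth_tgt = [translate_s2t(s) for s in mono_src]    # synthetic target side
    synth_src = [translate_t2s(t) for t in mono_tgt]    # synthetic source side

    # Source->target model learns from (synthetic source, real target) pairs.
    train_supervised(direction="s2t", pairs=list(zip(synth_src, mono_tgt)))
    # Target->source model learns from (synthetic target, real source) pairs.
    train_supervised(direction="t2s", pairs=list(zip(synth_tgt, mono_src)))
```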

Cross-lingual Language Model Pretraining

NeurIPS 2019 huggingface/transformers

On unsupervised machine translation, we obtain 34.3 BLEU on WMT'16 German-English, improving the previous state of the art by more than 9 BLEU.

LANGUAGE MODELLING UNSUPERVISED MACHINE TRANSLATION

The Evolved Transformer

30 Jan 2019 · tensorflow/tensor2tensor

Recent works have highlighted the strength of the Transformer architecture on sequence tasks while, at the same time, neural architecture search (NAS) has begun to outperform human-designed models.

MACHINE TRANSLATION NEURAL ARCHITECTURE SEARCH

Universal Transformers

ICLR 2019 tensorflow/tensor2tensor

Feed-forward and convolutional architectures have recently been shown to achieve superior results on some sequence modeling tasks such as machine translation, with the added advantage that they concurrently process all inputs in the sequence, leading to easy parallelization and faster training times.

LANGUAGE MODELLING LEARNING TO EXECUTE MACHINE TRANSLATION
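
The paper's key twist is recurrence in depth: one parameter-shared block applied repeatedly, rather than a stack of distinct layers. A minimal PyTorch sketch (fixed step count; the paper additionally uses adaptive computation time and per-step timestep signals, omitted here):

```python
# Universal-Transformer-style encoder skeleton: the same block (any
# Transformer layer passed in by the caller) is reused at every depth step.
import torch.nn as nn

class UniversalEncoder(nn.Module):
    def __init__(self, block: nn.Module, steps: int = 6):
        super().__init__()
        self.block = block           # one set of weights, shared across steps
        self.steps = steps

    def forward(self, x):
        for _ in range(self.steps):  # recurrence over depth, not over time
            x = self.block(x)
        return x
```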

Training Tips for the Transformer Model

1 Apr 2018 · tensorflow/tensor2tensor

This article describes our experiments in neural machine translation using the recent Tensor2Tensor framework and the Transformer sequence-to-sequence model (Vaswani et al., 2017).

MACHINE TRANSLATION