
Machine Translation

451 papers with code · Natural Language Processing

Machine translation is the task of automatically translating text from a source language into a different target language.

State-of-the-art leaderboards


Greatest papers with code

Can Active Memory Replace Attention?

NeurIPS 2016 tensorflow/models

Several mechanisms to focus attention of a neural network on selected parts of its input or memory have been used successfully in deep learning models in recent years.

IMAGE CAPTIONING MACHINE TRANSLATION

Exploiting Similarities among Languages for Machine Translation

17 Sep 2013 tensorflow/models

Dictionaries and phrase tables are the basis of modern statistical machine translation systems.
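To make the role of a phrase table concrete, here is a minimal sketch of phrase-based lookup with greedy longest-match decoding. The phrase table and sentences are invented for illustration; real SMT systems score many competing segmentations with translation and language-model probabilities rather than taking the first match.

```python
# Toy phrase table: source phrases (as word tuples) -> target strings.
# Entries are made up for demonstration.
PHRASE_TABLE = {
    ("das", "haus"): "the house",
    ("ist",): "is",
    ("klein",): "small",
}

def translate(sentence):
    """Greedy longest-match decoding over a toy phrase table."""
    words = sentence.lower().split()
    out = []
    i = 0
    while i < len(words):
        # Try the longest phrase starting at position i first.
        for n in range(len(words) - i, 0, -1):
            phrase = tuple(words[i:i + n])
            if phrase in PHRASE_TABLE:
                out.append(PHRASE_TABLE[phrase])
                i += n
                break
        else:
            out.append(words[i])  # pass unknown words through unchanged
            i += 1
    return " ".join(out)

print(translate("Das Haus ist klein"))  # → "the house is small"
```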

MACHINE TRANSLATION

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 tensorflow/models

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG SUPERTAGGING DEPENDENCY PARSING MACHINE TRANSLATION MULTI-TASK LEARNING NAMED ENTITY RECOGNITION UNSUPERVISED REPRESENTATION LEARNING

Language Models are Unsupervised Multitask Learners

Preprint 2019 huggingface/pytorch-pretrained-BERT

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets.

SOTA for Language Modelling on Text8 (using extra training data)

COMMON SENSE REASONING DOCUMENT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION QUESTION ANSWERING READING COMPREHENSION

Cross-lingual Language Model Pretraining

22 Jan 2019 huggingface/pytorch-pretrained-BERT

On unsupervised machine translation, we obtain 34.3 BLEU on WMT'16 German-English, improving the previous state of the art by more than 9 BLEU.
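Since BLEU is the headline metric in these results, here is a simplified sentence-level sketch of how it is computed: the geometric mean of modified n-gram precisions, times a brevity penalty. This is an illustrative single-reference version without smoothing, not the exact corpus-level WMT scoring script.

```python
import math
from collections import Counter

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of modified
    n-gram precisions times a brevity penalty (single reference,
    no smoothing)."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_ngrams = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
        ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
        # Clipped overlap: each candidate n-gram counts at most as
        # often as it appears in the reference.
        overlap = sum((cand_ngrams & ref_ngrams).values())
        total = max(sum(cand_ngrams.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    bp = min(1.0, math.exp(1 - len(ref) / len(cand)))  # brevity penalty
    return bp * geo_mean

print(bleu("the house is small", "the house is small"))  # → 1.0
```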

LANGUAGE MODELLING UNSUPERVISED MACHINE TRANSLATION

The Evolved Transformer

30 Jan 2019 tensorflow/tensor2tensor

Recent works have highlighted the strength of the Transformer architecture on sequence tasks while, at the same time, neural architecture search (NAS) has begun to outperform human-designed models.

MACHINE TRANSLATION NEURAL ARCHITECTURE SEARCH

Universal Transformers

ICLR 2019 tensorflow/tensor2tensor

Feed-forward and convolutional architectures have recently been shown to achieve superior results on some sequence modeling tasks such as machine translation, with the added advantage that they concurrently process all inputs in the sequence, leading to easy parallelization and faster training times.

LANGUAGE MODELLING LEARNING TO EXECUTE MACHINE TRANSLATION

Training Tips for the Transformer Model

1 Apr 2018 tensorflow/tensor2tensor

This article describes our experiments in neural machine translation using the recent Tensor2Tensor framework and the Transformer sequence-to-sequence model (Vaswani et al., 2017).

MACHINE TRANSLATION

Tensor2Tensor for Neural Machine Translation

WS 2018 tensorflow/tensor2tensor

Tensor2Tensor is a library for deep learning models that is well-suited for neural machine translation and includes the reference implementation of the state-of-the-art Transformer model.
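As a rough sketch of how Tensor2Tensor is typically driven, a Transformer can be trained on a WMT English-German problem from the command line. The flag and problem names below follow my recollection of the T2T walkthrough and may differ across library versions; treat this as an assumed configuration fragment, not verified against a specific release.

```shell
# Sketch of a typical t2t-trainer invocation (names assumed; check your
# installed version's documentation before running).
t2t-trainer \
  --generate_data \
  --data_dir="$HOME/t2t_data" \
  --output_dir="$HOME/t2t_train/ende" \
  --problem=translate_ende_wmt32k \
  --model=transformer \
  --hparams_set=transformer_base_single_gpu \
  --train_steps=100000
```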

MACHINE TRANSLATION

Self-Attention with Relative Position Representations

NAACL 2018 tensorflow/tensor2tensor

On the WMT 2014 English-to-German and English-to-French translation tasks, this approach yields improvements of 1.3 BLEU and 0.3 BLEU over absolute position representations, respectively.
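The core idea of the paper can be sketched in a few lines of NumPy: attention logits get an extra term from learned embeddings of the clipped relative distance between query and key positions (the key-side term of Shaw et al., 2018). Sizes and random values below are toy choices for illustration, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d, max_rel = 5, 8, 2  # toy sizes, chosen for illustration

q = rng.standard_normal((seq_len, d))
k = rng.standard_normal((seq_len, d))
# One learned embedding per clipped relative distance in [-max_rel, max_rel].
rel_emb = rng.standard_normal((2 * max_rel + 1, d))

# Clip relative distances j - i to the window, then index the embeddings.
idx = np.arange(seq_len)
rel = np.clip(idx[None, :] - idx[:, None], -max_rel, max_rel) + max_rel
a_k = rel_emb[rel]                      # (seq_len, seq_len, d)

# Logits: content term plus relative-position term, then scaled softmax.
logits = (q @ k.T + np.einsum("id,ijd->ij", q, a_k)) / np.sqrt(d)
weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
print(weights.shape)  # (5, 5); each row sums to 1
```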

MACHINE TRANSLATION