
Machine Translation

484 papers with code · Natural Language Processing

Machine translation is the task of translating text from a source language into a different target language.


Latest papers without code

Monotonic Multihead Attention

ICLR 2020

Simultaneous machine translation models start generating a target sequence before they have finished reading or encoding the full source sequence.

MACHINE TRANSLATION
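The read/write loop behind simultaneous translation can be illustrated with a simple wait-k policy (a fixed schedule, not the learned monotonic attention this paper proposes); `translate_step` is a hypothetical stand-in for the underlying translation model:

```python
def wait_k_translate(source_tokens, k, translate_step):
    """Simulate a wait-k simultaneous translation policy: READ the first
    k source tokens, then alternate one WRITE per READ."""
    read, output = [], []
    for i, tok in enumerate(source_tokens):
        read.append(tok)                                 # READ action
        if i >= k - 1:                                   # initial wait is over
            output.append(translate_step(read, output))  # WRITE one target token
    # Source exhausted: flush the remaining target tokens.
    # (Toy assumption: the target has the same length as the source.)
    while len(output) < len(source_tokens):
        output.append(translate_step(read, output))
    return output
```

With k=2 the model emits its first target token after reading two source tokens, so translation overlaps with reading instead of waiting for the whole sentence.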

Neural Phrase-to-Phrase Machine Translation

ICLR 2020

We present Neural Phrase-to-Phrase Machine Translation, a phrase-based translation model that uses a novel phrase-attention mechanism to discover relevant input (source) segments for generating output (target) phrases.

MACHINE TRANSLATION

In-training Matrix Factorization for Parameter-frugal Neural Machine Translation

ICLR 2020

In this paper, we propose the use of in-training matrix factorization to reduce the model size for neural machine translation.

MACHINE TRANSLATION

Improved Training Techniques for Online Neural Machine Translation

ICLR 2020

We investigate the sensitivity of such models to the value of k used during training and at deployment, and the effect of updating the hidden states of transformer models as new source tokens are read.

MACHINE TRANSLATION SPEECH RECOGNITION

Compositional Continual Language Learning

ICLR 2020

Experimental results show that the proposed method yields a significant improvement over state-of-the-art methods: it enables knowledge transfer and prevents catastrophic forgetting, resulting in more than 85% accuracy up to 100 stages, compared with less than 50% accuracy for baselines.

MACHINE TRANSLATION TRANSFER LEARNING

Efficient Transformer for Mobile Applications

ICLR 2020

It outperforms the Transformer by 0.9 BLEU under 500M Mult-Adds and by 1.1 BLEU under 100M Mult-Adds on WMT'14 English-German.

AUTOML MACHINE TRANSLATION QUESTION ANSWERING

Resolving Lexical Ambiguity in English–Japanese Neural Machine Translation

ICLR 2020

Lexical ambiguity, i.e., the presence of two or more meanings for a single word, is an inherent and challenging problem for machine translation systems.

LANGUAGE MODELLING MACHINE TRANSLATION WORD EMBEDDINGS

Residual Energy-Based Models for Text Generation

ICLR 2020

In this work, we investigate unnormalized energy-based models (EBMs) which operate not at the token but at the sequence level.

LANGUAGE MODELLING MACHINE TRANSLATION TEXT GENERATION
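One way a sequence-level energy term can be used, sketched here under assumed scoring functions (`base_logprob` and `energy` are hypothetical stand-ins, not the paper's models), is to rerank candidate sequences by the residual score log p(x) - E(x):

```python
def residual_rerank(candidates, base_logprob, energy):
    """Score each candidate sequence with an unnormalized residual-EBM
    log-score (base language-model log-probability minus an energy term)
    and return the highest-scoring candidate."""
    return max(candidates, key=lambda seq: base_logprob(seq) - energy(seq))

# Toy scores: the energy term overturns the base model's ranking.
base = {"a": -0.5, "b": -1.0}
en = {"a": 2.0, "b": 0.1}
best = residual_rerank(["a", "b"], base.get, en.get)  # "b": -1.1 beats "a": -2.5
```

Because the score is unnormalized, it can only compare candidates against each other; sampling from the model requires more machinery than this sketch shows.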

Multichannel Generative Language Models

ICLR 2020

For conditional generation, the model is given one fully observed channel and generates the remaining k-1 channels in parallel.

MACHINE TRANSLATION

Self-Induced Curriculum Learning in Neural Machine Translation

ICLR 2020

Self-supervised neural machine translation (SS-NMT) learns both how to extract suitable training data from comparable (rather than parallel) corpora and how to translate, so that the two tasks support each other in a virtuous circle.

DENOISING MACHINE TRANSLATION