
Machine Translation

328 papers with code · Natural Language Processing

Machine translation is the task of translating a sentence from a source language into a target language.

State-of-the-art leaderboards


Latest papers with code

Language Models are Unsupervised Multitask Learners

Preprint 2019 openai/gpt-2

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets. We demonstrate that language models begin to learn these tasks without any explicit supervision when trained on a new dataset of millions of webpages called WebText.

COMMON SENSE REASONING DOCUMENT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION QUESTION ANSWERING READING COMPREHENSION

Multilingual Neural Machine Translation With Soft Decoupled Encoding

ICLR 2019 cindyxinyiwang/SDE

Multilingual training of neural machine translation (NMT) systems has led to impressive accuracy improvements on low-resource languages. However, there are still significant challenges in efficiently learning word representations in the face of paucity of data.

MACHINE TRANSLATION

09 Feb 2019

Two New Evaluation Datasets for Low-Resource Machine Translation: Nepali-English and Sinhala-English

4 Feb 2019 facebookresearch/flores

Besides the technical challenges of learning with limited supervision, there is another challenge: it is very difficult to evaluate methods trained on low-resource language pairs because there are very few freely and publicly available benchmarks. These are languages with very different morphology and syntax, for which little out-of-domain parallel data is available and for which relatively large amounts of monolingual data are freely available.

MACHINE TRANSLATION


Unsupervised Clinical Language Translation

4 Feb 2019 ckbjimmy/p2c

As patients' access to their doctors' clinical notes becomes common, translating professional, clinical jargon to layperson-understandable language is essential to improve patient-clinician communication. Such translation yields better clinical outcomes by enhancing patients' understanding of their own health conditions, and thus improving patients' involvement in their own care.

CLINICAL LANGUAGE TRANSLATION REPRESENTATION LEARNING


The Evolved Transformer

30 Jan 2019tensorflow/tensor2tensor

Recent works have highlighted the strengths of the Transformer architecture for dealing with sequence tasks. At the same time, neural architecture search has advanced to the point where it can outperform human-designed models.

ARCHITECTURE SEARCH MACHINE TRANSLATION

Pay Less Attention with Lightweight and Dynamic Convolutions

ICLR 2019 pytorch/fairseq

Self-attention is a useful mechanism to build generative models for language and images. We predict separate convolution kernels based solely on the current time-step in order to determine the importance of context elements.

ABSTRACTIVE TEXT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION
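The snippet's key idea, predicting a convolution kernel from the current time-step alone, can be sketched without any framework. Below is a minimal single-head, causal NumPy sketch; the projection `W` and window width `k` are illustrative assumptions, not the fairseq implementation (which is depthwise and multi-head):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dynamic_conv(x, W, k):
    """x: (T, d) sequence; W: (d, k) projects each time-step's own
    features to a width-k kernel, softmax-normalized over the window."""
    T, d = x.shape
    kernels = softmax(x @ W)                    # (T, k): one kernel per position
    pad = np.vstack([np.zeros((k - 1, d)), x])  # left-pad for causal context
    out = np.empty_like(x)
    for t in range(T):
        window = pad[t:t + k]                   # k most recent inputs ending at t
        out[t] = kernels[t] @ window            # weighted sum of context vectors
    return out
```

Unlike self-attention, the weight over each context element here depends only on the current position's features, not on pairwise comparisons, which is what makes the cost linear in sequence length.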

Fixup Initialization: Residual Learning Without Normalization

ICLR 2019 valilenk/fixup

Normalization layers are a staple in state-of-the-art deep neural network architectures. They are widely believed to stabilize training, enable higher learning rates, accelerate convergence, and improve generalization, though the reason for their effectiveness is still an active research topic.

IMAGE CLASSIFICATION MACHINE TRANSLATION

27 Jan 2019

Cross-lingual Language Model Pretraining

22 Jan 2019 facebookresearch/XLM

On unsupervised machine translation, we obtain 34.3 BLEU on WMT'16 German-English, improving the previous state of the art by more than 9 BLEU. On supervised machine translation, we obtain a new state of the art of 38.5 BLEU on WMT'16 Romanian-English, outperforming the previous best approach by more than 4 BLEU.

LANGUAGE MODELLING UNSUPERVISED MACHINE TRANSLATION
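For readers unfamiliar with the metric these gains are reported in: BLEU is the geometric mean of clipped n-gram precisions multiplied by a brevity penalty. A minimal sentence-level sketch follows (unsmoothed, single reference; published scores like those above are corpus-level with standardized tokenization, e.g. via sacreBLEU):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(hypothesis, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of clipped n-gram
    precisions times a brevity penalty (no smoothing)."""
    precisions = []
    for n in range(1, max_n + 1):
        hyp = Counter(ngrams(hypothesis, n))
        ref = Counter(ngrams(reference, n))
        overlap = sum(min(c, ref[g]) for g, c in hyp.items())  # clipped matches
        precisions.append(overlap / max(sum(hyp.values()), 1))
    if min(precisions) == 0:
        return 0.0
    bp = min(1.0, math.exp(1 - len(reference) / len(hypothesis)))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

A "+9 BLEU" improvement on this 0-to-100-style scale (here 0.0 to 1.0 before the conventional x100) is a very large jump for unsupervised translation.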

Unsupervised Neural Machine Translation with SMT as Posterior Regularization

14 Jan 2019 Imagist-Shuo/UNMT-SPR

Without a real bilingual corpus available, unsupervised Neural Machine Translation (NMT) typically requires pseudo-parallel data generated with the back-translation method for model training, but such data is noisy. To address this issue, we introduce phrase-based Statistical Machine Translation (SMT) models, which are robust to noisy data, as posterior regularizations to guide the training of unsupervised NMT models in the iterative back-translation process.

UNSUPERVISED MACHINE TRANSLATION

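The iterative back-translation loop the snippet refers to can be sketched with toy stand-ins. Here a "model" is just a word-substitution table learned from sentence pairs, and `train_model`/`translate` are hypothetical placeholders for real NMT training and decoding; only the loop structure mirrors the actual procedure:

```python
def train_model(pairs):
    """Toy 'training': build a word table from equal-length sentence pairs."""
    table = {}
    for src, tgt in pairs:
        for s, t in zip(src.split(), tgt.split()):
            table[s] = t
    return table

def translate(model, sentence):
    return " ".join(model.get(w, w) for w in sentence.split())

def back_translation(seed_pairs, mono_src, mono_tgt, rounds=2):
    fwd = train_model(seed_pairs)                       # source -> target
    bwd = train_model([(t, s) for s, t in seed_pairs])  # target -> source
    for _ in range(rounds):
        # Back-translate monolingual target text into pseudo-parallel
        # pairs, retrain the forward model on them, then do the reverse.
        pseudo_src = [(translate(bwd, t), t) for t in mono_tgt]
        fwd = train_model(seed_pairs + pseudo_src)
        pseudo_tgt = [(s, translate(fwd, s)) for s in mono_src]
        bwd = train_model([(t, s) for s, t in seed_pairs + pseudo_tgt])
    return fwd, bwd
```

The paper's contribution sits inside this loop: an SMT model filters the noisy pseudo-parallel data (as a posterior regularizer) before it is used for retraining, a step deliberately omitted from this sketch.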

Choosing the Right Word: Using Bidirectional LSTM Tagger for Writing Support Systems

8 Jan 2019 vicmak/Exploiting-BiLSTM-for-Proper-Word-Choice

We use a bidirectional Recurrent Neural Network (RNN) with LSTM for learning the proper word choice based on a word's sentential context. We demonstrate and evaluate our application on both a domain-specific (scientific) writing task and a general-purpose writing task.

GRAMMATICAL ERROR CORRECTION MACHINE TRANSLATION

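The task here, picking a word from its bidirectional sentential context, can be illustrated without a neural network. The count-based scorer below is a crude, purely illustrative stand-in for the paper's BiLSTM tagger (all names are assumptions): it ranks candidate words by how often they appear between the same left and right neighbors in a reference corpus.

```python
from collections import Counter

def build_context_counts(corpus):
    """Count (left, word, right) trigrams over tokenized sentences."""
    counts = Counter()
    for sent in corpus:
        toks = ["<s>"] + sent.split() + ["</s>"]
        for left, word, right in zip(toks, toks[1:], toks[2:]):
            counts[(left, word, right)] += 1
    return counts

def choose_word(counts, left, right, candidates):
    """Pick the candidate best supported by BOTH neighbors --
    mimicking a model that reads the sentence in both directions."""
    return max(candidates, key=lambda w: counts[(left, w, right)])
```

A BiLSTM generalizes this far beyond adjacent words: its forward and backward states summarize the entire left and right context, so it can prefer "strong coffee" over "powerful coffee" even when the decisive cue is several tokens away.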