EMNLP 2018

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 tensorflow/models

We propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG SUPERTAGGING DEPENDENCY PARSING MACHINE TRANSLATION MULTI-TASK LEARNING NAMED ENTITY RECOGNITION UNSUPERVISED REPRESENTATION LEARNING
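
As an illustration of the idea, here is a minimal PyTorch sketch of cross-view training for sequence tagging: auxiliary prediction modules see restricted views of the Bi-LSTM states (forward-only or backward-only), and on unlabeled data they are trained to match the full model's predictions. All names and shapes here are illustrative, not taken from the official tensorflow/models code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CVTTagger(nn.Module):
    """Sketch: the primary module sees the full Bi-LSTM states; auxiliary
    modules see restricted views (forward-only / backward-only)."""
    def __init__(self, vocab_size, emb_dim, hidden_dim, num_tags):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.primary = nn.Linear(2 * hidden_dim, num_tags)  # full view
        self.aux_fwd = nn.Linear(hidden_dim, num_tags)      # forward states only
        self.aux_bwd = nn.Linear(hidden_dim, num_tags)      # backward states only

    def forward(self, tokens):
        states, _ = self.bilstm(self.embed(tokens))         # (B, T, 2H)
        h_fwd, h_bwd = states.chunk(2, dim=-1)
        return self.primary(states), self.aux_fwd(h_fwd), self.aux_bwd(h_bwd)

def cvt_step(model, labeled_batch, unlabeled_tokens):
    """Supervised loss on labeled data plus a consistency loss that
    pushes each restricted view toward the full model's predictions."""
    tokens, tags = labeled_batch
    logits, _, _ = model(tokens)
    sup_loss = F.cross_entropy(logits.transpose(1, 2), tags)

    logits_u, aux_f, aux_b = model(unlabeled_tokens)
    target = F.softmax(logits_u, dim=-1).detach()           # teacher not updated
    cvt_loss = sum(F.kl_div(F.log_softmax(a, dim=-1), target,
                            reduction="batchmean")
                   for a in (aux_f, aux_b))
    return sup_loss + cvt_loss
```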

Phrase-Based & Neural Unsupervised Machine Translation

EMNLP 2018 huggingface/pytorch-transformers

Machine translation systems achieve near human-level performance on some languages, yet their effectiveness strongly relies on the availability of large amounts of parallel sentences, which hinders their applicability to the majority of language pairs.

UNSUPERVISED MACHINE TRANSLATION

Understanding Back-Translation at Scale

EMNLP 2018 pytorch/fairseq

An effective method to improve neural machine translation with monolingual data is to augment the parallel training corpus with back-translations of target language sentences.

MACHINE TRANSLATION
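
The recipe itself is short: train a reverse (target-to-source) model, translate monolingual target sentences, and mix the resulting synthetic pairs into the real bitext. A minimal sketch of that loop, with `translate` as a placeholder for an actual decoding routine (the paper finds sampling and noised beam search outperform pure beam search as the synthesis method):

```python
def back_translate(reverse_model, mono_target, translate):
    """Create synthetic (source, target) pairs from monolingual target
    text. `reverse_model` is a trained target->source model and
    `translate` is any decoding routine fn(model, sentences) -> sentences;
    both names are placeholders, not fairseq calls."""
    synthetic_sources = translate(reverse_model, mono_target)
    return list(zip(synthetic_sources, mono_target))

def augmented_corpus(real_pairs, synthetic_pairs, upsample_real=1):
    """Mix real bitext (optionally upsampled) with synthetic bitext;
    the forward source->target model is then trained on the union."""
    return real_pairs * upsample_real + synthetic_pairs
```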

Simple Recurrent Units for Highly Parallelizable Recurrence

EMNLP 2018 aymericdamien/TopDeepLearning

Common recurrent neural architectures scale poorly due to the intrinsic difficulty in parallelizing their state computations.

MACHINE TRANSLATION QUESTION ANSWERING TEXT CLASSIFICATION
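
The fix proposed here is to make every matrix multiplication independent of the previous hidden state, so the heavy computation can be batched across all timesteps and only a cheap elementwise update runs sequentially. A minimal sketch of the recurrence (input and hidden sizes assumed equal; parameter shapes illustrative):

```python
import torch

def sru_forward(x, W, Wf, Wr, vf, vr, bf, br):
    """SRU sketch. x: (T, B, d). The three matmuls below touch no
    recurrent state, so they run for all timesteps at once; the loop
    that remains is elementwise only."""
    xt = x @ W                     # candidate values, batched over time
    fx = x @ Wf                    # forget-gate pre-activations
    rx = x @ Wr                    # reset-gate pre-activations
    c = torch.zeros_like(x[0])     # internal state c_0
    hs = []
    for t in range(x.shape[0]):    # cheap sequential part
        f = torch.sigmoid(fx[t] + vf * c + bf)
        r = torch.sigmoid(rx[t] + vr * c + br)
        c = f * c + (1 - f) * xt[t]
        hs.append(r * c + (1 - r) * x[t])   # highway connection to input
    return torch.stack(hs), c
```

Because `xt`, `fx`, and `rx` are precomputed, the per-step work reduces to O(d) elementwise operations, which is what makes the layer parallelize well on GPUs.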

Loss in Translation: Learning Bilingual Word Mapping with a Retrieval Criterion

EMNLP 2018 facebookresearch/MUSE

Continuous word representations learned separately on distinct languages can be aligned so that their words become comparable in a common space.
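
The retrieval criterion the title refers to is CSLS (cross-domain similarity local scaling), which the paper relaxes into a differentiable training loss (RCSLS). A NumPy sketch of CSLS retrieval over row-normalized embedding matrices, not the facebookresearch/MUSE API:

```python
import numpy as np

def csls_scores(src_emb, tgt_emb, k=10):
    """CSLS similarity between row-normalized embedding matrices.
    Penalizing each word's average similarity to its k nearest
    neighbours suppresses 'hub' words that are close to everything."""
    sims = src_emb @ tgt_emb.T                          # cosine similarities
    r_src = np.sort(sims, axis=1)[:, -k:].mean(axis=1, keepdims=True)
    r_tgt = np.sort(sims, axis=0)[-k:, :].mean(axis=0, keepdims=True)
    return 2 * sims - r_src - r_tgt

def translate_words(src_emb, tgt_emb, k=10):
    """Retrieve, for each source word, the index of its translation."""
    return csls_scores(src_emb, tgt_emb, k).argmax(axis=1)
```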

Non-Adversarial Unsupervised Word Translation

EMNLP 2018 facebookresearch/MUSE

We present a novel method that first aligns the second moment of the word distributions of the two languages and then iteratively refines the alignment.
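
A sketch of the two stages under stated assumptions: projecting each language onto its top principal components aligns the second moments of the two distributions, and an ICP-style loop then alternates nearest-neighbour matching with an orthogonal Procrustes fit. This illustrates the principle rather than reproducing the paper's exact procedure:

```python
import numpy as np

def pca_project(emb, dims=50):
    """Project centred embeddings onto their top principal components;
    doing this independently per language puts both distributions into
    axes aligned with their second moments."""
    emb = emb - emb.mean(axis=0)
    _, _, vt = np.linalg.svd(emb, full_matrices=False)
    return emb @ vt[:dims].T

def icp_refine(src, tgt, iters=5):
    """Iterative refinement: match each source vector to its nearest
    target vector, fit the best orthogonal map to those matches
    (Procrustes), and repeat."""
    W = np.eye(src.shape[1])
    for _ in range(iters):
        matches = (src @ W @ tgt.T).argmax(axis=1)
        u, _, vt = np.linalg.svd(src.T @ tgt[matches])
        W = u @ vt
    return W
```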

XNLI: Evaluating Cross-lingual Sentence Representations

EMNLP 2018 facebookresearch/XLM

State-of-the-art natural language processing systems rely on supervision in the form of annotated data to learn competent models.

CROSS-LINGUAL NATURAL LANGUAGE INFERENCE MACHINE TRANSLATION

Improving the Transformer Translation Model with Document-Level Context

EMNLP 2018 thumt/THUMT

Although the Transformer translation model (Vaswani et al., 2017) has achieved state-of-the-art performance on a variety of translation tasks, it remains a challenge to use document-level context to handle the discourse phenomena that are problematic for the Transformer.

Unsupervised Statistical Machine Translation

EMNLP 2018 artetxem/vecmap

While modern machine translation has relied on large parallel corpora, a recent line of work has managed to train Neural Machine Translation (NMT) systems from monolingual corpora only (Artetxe et al., 2018c; Lample et al., 2018).

LANGUAGE MODELLING UNSUPERVISED MACHINE TRANSLATION

Bayesian Compression for Natural Language Processing

EMNLP 2018 ars-ashuha/variational-dropout-sparsifies-dnn

In natural language processing, many tasks are successfully solved with recurrent neural networks, but such models have a huge number of parameters.
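
The linked repo implements sparse variational dropout (Molchanov et al., 2017), in which every weight gets a learned dropout rate and weights whose rate grows large are pruned away. A hedged PyTorch sketch of one such layer (constants come from the paper's KL approximation; the layer name is mine):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseVDLinear(nn.Module):
    """Linear layer with sparse variational dropout: per-weight noise
    variances are learned, and weights whose implied dropout rate
    (log alpha) exceeds a threshold are zeroed at test time."""
    def __init__(self, n_in, n_out, threshold=3.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(n_out, n_in) * 0.01)
        self.log_sigma2 = nn.Parameter(torch.full((n_out, n_in), -10.0))
        self.threshold = threshold  # prune where log alpha > threshold

    def log_alpha(self):
        return self.log_sigma2 - 2 * torch.log(self.weight.abs() + 1e-8)

    def forward(self, x):
        if self.training:
            # Local reparameterization: sample the pre-activations.
            mu = F.linear(x, self.weight)
            var = F.linear(x * x, self.log_sigma2.exp())
            return mu + var.sqrt() * torch.randn_like(mu)
        mask = (self.log_alpha() < self.threshold).float()
        return F.linear(x, self.weight * mask)

    def kl(self):
        # Approximate KL term from Molchanov et al.; adding it to the
        # task loss drives most log-alphas up, i.e. most weights sparse.
        la = self.log_alpha()
        k1, k2, k3 = 0.63576, 1.87320, 1.48695
        return -(k1 * torch.sigmoid(k2 + k3 * la)
                 - 0.5 * F.softplus(-la) - k1).sum()
```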