EACL 2017

Neural Semantic Encoders

EACL 2017 Smerity/keras_snli

We present a memory-augmented neural network for natural language understanding: Neural Semantic Encoders.

MACHINE TRANSLATION · NATURAL LANGUAGE INFERENCE · QUESTION ANSWERING · SENTENCE CLASSIFICATION · SENTIMENT ANALYSIS
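
As a rough illustration of the memory-augmented idea behind Neural Semantic Encoders, the sketch below runs a single read-compose-write step over an external memory in NumPy. It is only a simplification: the paper uses three LSTMs for the read, compose and write functions, which are replaced here by single tanh layers, and all names and sizes are ours, not the authors' code.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def nse_step(x_t, M, params):
    """One simplified read-compose-write step of an NSE-style memory encoder.

    x_t : (d,) current token representation
    M   : (n_slots, d) external memory (initialised from the input embeddings)
    The paper's read/compose/write LSTMs are replaced by single tanh layers
    (W_r, W_c, W_w) purely to keep the sketch short.
    """
    W_r, W_c, W_w = params
    o = np.tanh(W_r @ x_t)                            # read key
    z = softmax(M @ o)                                # attention over memory slots
    m_read = z @ M                                    # retrieved memory summary
    c = np.tanh(W_c @ np.concatenate([o, m_read]))    # compose
    h = np.tanh(W_w @ c)                              # write vector / output state
    M_new = M * (1.0 - z[:, None]) + np.outer(z, h)   # erase-then-write update
    return h, M_new

d, n_slots = 8, 5
rng = np.random.default_rng(0)
M = rng.normal(size=(n_slots, d))                     # toy memory
params = (rng.normal(size=(d, d)),
          rng.normal(size=(d, 2 * d)),
          rng.normal(size=(d, d)))
h, M = nse_step(rng.normal(size=d), M, params)
print(h.shape, M.shape)                               # (8,) (5, 8)
```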

Nematus: a Toolkit for Neural Machine Translation

EACL 2017 MarcBS/keras

We present Nematus, a toolkit for Neural Machine Translation.

MACHINE TRANSLATION

Using the Output Embedding to Improve Language Models

EACL 2017 lium-lst/nmtpy

We study the topmost weight matrix of neural network language models.
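The "topmost weight matrix" here is the language model's output projection, and the paper's central observation is that tying it to the input embedding helps. A minimal NumPy sketch of the untied versus tied output layer (toy sizes and names, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 10_000, 128                       # vocabulary size, embedding / hidden size

E = rng.normal(scale=0.1, size=(V, d))   # input embedding matrix

def lm_logits_untied(h, W_out):
    # conventional LM: a separate topmost (output) weight matrix
    return h @ W_out.T                   # (V,)

def lm_logits_tied(h, E):
    # weight tying: reuse the input embedding as the output projection,
    # saving V*d parameters and coupling the two representations
    return h @ E.T                       # (V,)

h = rng.normal(size=d)                   # hidden state from the recurrent LM
W_out = rng.normal(scale=0.1, size=(V, d))
print(lm_logits_untied(h, W_out).shape, lm_logits_tied(h, E).shape)
```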

JFLEG: A Fluency Corpus and Benchmark for Grammatical Error Correction

EACL 2017 keisks/jfleg

We present a new parallel corpus, the JHU FLuency-Extended GUG corpus (JFLEG), for developing and evaluating grammatical error correction (GEC).

GRAMMATICAL ERROR CORRECTION

Identifying beneficial task relations for multi-task learning in deep neural networks

EACL 2017 jbingel/eacl2017_mtl

Multi-task learning (MTL) in deep neural networks for NLP has recently received increasing interest due to some compelling benefits, including its potential to efficiently regularize models and to reduce the need for labeled data.

MULTI-TASK LEARNING
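
The MTL setups examined in the paper build on hard parameter sharing: a body of parameters shared across tasks plus a separate output head per task. The paper's models are deep bi-LSTM sequence labellers; the sketch below is a deliberately simplified feed-forward version that only shows the sharing pattern, with all sizes and names invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

d_in, d_hid, n_main, n_aux = 50, 64, 5, 12   # toy sizes, not from the paper

# hard parameter sharing: one shared body, one output head per task
W_shared = rng.normal(scale=0.1, size=(d_hid, d_in))
W_main   = rng.normal(scale=0.1, size=(n_main, d_hid))
W_aux    = rng.normal(scale=0.1, size=(n_aux, d_hid))

def forward(x, task):
    h = relu(W_shared @ x)                   # parameters shared across tasks
    W_head = W_main if task == "main" else W_aux
    return W_head @ h                        # task-specific logits

x = rng.normal(size=d_in)
print(forward(x, "main").shape, forward(x, "aux").shape)   # (5,) (12,)
```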

Hypernyms under Siege: Linguistically-motivated Artillery for Hypernymy Detection

EACL 2017 vered1986/UnsupervisedHypernymy

The fundamental role of hypernymy in NLP has motivated the development of many methods for the automatic identification of this relation, most of which rely on word distributions.

HYPERNYM DISCOVERY
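
Many of the unsupervised, distributionally motivated measures compared in this line of work are inclusion-based; WeedsPrec is a common member of that family. Below is a small NumPy sketch with toy, purely illustrative context vectors (not data or results from the paper).

```python
import numpy as np

def weeds_prec(u, v):
    """Distributional inclusion measure WeedsPrec(u -> v).

    u, v : non-negative context-weight vectors (e.g. PPMI) of equal length.
    Returns the proportion of u's weight that falls on contexts v also occurs
    in; values near 1 suggest u's contexts are included in v's, a classic
    signal that v is a hypernym of u.
    """
    included = u[(u > 0) & (v > 0)].sum()
    total = u[u > 0].sum()
    return included / total if total > 0 else 0.0

# toy PPMI-style vectors over 6 shared contexts (illustrative numbers only)
cat    = np.array([2.0, 1.5, 0.0, 3.0, 0.0, 0.5])
animal = np.array([1.0, 2.0, 1.0, 2.5, 0.5, 0.0])
print(weeds_prec(cat, animal))   # higher: "cat" contexts mostly covered by "animal"
print(weeds_prec(animal, cat))   # lower: the measure is asymmetric
```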