EMNLP 2015

A Neural Attention Model for Abstractive Sentence Summarization

EMNLP 2015 tensorflow/models

Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build.

ABSTRACTIVE SENTENCE SUMMARIZATION
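
A minimal numpy sketch of the attention-weighted contextual encoder that this paper's title refers to: attention over the source words, conditioned on the summary generated so far, gives a soft input representation for the decoder. The names X, y_c, and P and all dimensions are illustrative assumptions, not the paper's exact model.

import numpy as np

def attention_encoder(X, y_c, P):
    """X   : (n, d) embeddings of the n source words.
    y_c : (d,)   embedding of the current summary context.
    P   : (d, d) learned interaction matrix (assumed name).
    Returns an attention-weighted sum of the source embeddings."""
    scores = X @ P @ y_c              # (n,) relevance of each source word
    p = np.exp(scores - scores.max())
    p /= p.sum()                      # softmax over source positions
    return p @ X                      # (d,) soft input representation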

Effective Approaches to Attention-based Neural Machine Translation

EMNLP 2015 facebookresearch/fairseq-py

Our ensemble model using different attention architectures has established a new state-of-the-art result in the WMT'15 English to German translation task with 25.9 BLEU points, an improvement of 1.0 BLEU points over the existing best system backed by NMT and an n-gram reranker.

SOTA for Machine Translation on WMT'15 English-German (BLEU metric)

MACHINE TRANSLATION
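
A minimal numpy sketch of the global attention scoring the abstract alludes to: the paper compares several scoring functions between the decoder state and the encoder states; the dot and general variants are shown here under assumed shapes (the concat variant is omitted).

import numpy as np

def luong_scores(h_t, H_s, variant="dot", W_a=None):
    # h_t: (d,) decoder hidden state; H_s: (n, d) encoder hidden states
    if variant == "dot":
        return H_s @ h_t                  # score = h_t . h_s
    if variant == "general":
        return H_s @ (W_a @ h_t)          # score = h_t^T W_a h_s, W_a: (d, d)
    raise ValueError("unknown variant: " + variant)

def attention_context(h_t, H_s, variant="dot", W_a=None):
    s = luong_scores(h_t, H_s, variant, W_a)
    a = np.exp(s - s.max())
    a /= a.sum()                          # alignment weights (softmax)
    return a @ H_s                        # context vector c_t: (d,)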

Improved Transition-Based Parsing by Modeling Characters instead of Words with LSTMs

EMNLP 2015 clab/lstm-parser

We present extensions to a continuous-state dependency parsing method that make it applicable to morphologically rich languages.

DEPENDENCY PARSING
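
A hedged PyTorch sketch of the core idea, replacing word-level embedding lookup with a bidirectional LSTM over characters; the class name and layer sizes are assumptions, and the transition-based parser itself is not shown.

import torch
import torch.nn as nn

class CharWordEncoder(nn.Module):
    """Builds a word vector from its characters with a BiLSTM."""
    def __init__(self, n_chars, char_dim=32, word_dim=64):
        super().__init__()
        self.emb = nn.Embedding(n_chars, char_dim)
        self.lstm = nn.LSTM(char_dim, word_dim // 2,
                            bidirectional=True, batch_first=True)

    def forward(self, char_ids):          # char_ids: (1, word_length)
        _, (h, _) = self.lstm(self.emb(char_ids))
        # concatenate the final forward and backward hidden states
        return torch.cat([h[0], h[1]], dim=-1)   # (1, word_dim)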

A Generative Word Embedding Model and its Low Rank Positive Semidefinite Solution

EMNLP 2015 askerlee/topicvec

Most existing word embedding methods can be categorized into Neural Embedding Models and Matrix Factorization (MF)-based methods.
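
To make the MF view concrete, here is a small numpy sketch that factorizes a symmetric word-association matrix (e.g. PMI) under a positive semidefinite constraint by truncating its eigendecomposition; it illustrates the low-rank PSD idea, not the paper's exact solver.

import numpy as np

def psd_embedding(M, dim):
    """Approximate a symmetric association matrix M as E @ E.T
    with E of rank `dim` and E @ E.T positive semidefinite."""
    M = (M + M.T) / 2                        # symmetrize
    w, V = np.linalg.eigh(M)                 # eigendecomposition
    idx = np.argsort(w)[::-1][:dim]          # keep the largest eigenvalues
    w_top = np.clip(w[idx], 0, None)         # PSD: drop negative spectrum
    return V[:, idx] * np.sqrt(w_top)        # rows are word embeddings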

Finding Function in Form: Compositional Character Models for Open Vocabulary Word Representation

EMNLP 2015 wlin12/JNN

We introduce a model for constructing vector representations of words by composing characters using bidirectional LSTMs.

LANGUAGE MODELLING
PART-OF-SPEECH TAGGING
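
A hedged PyTorch sketch of the compositional character model: the word vector is a learned combination of the final forward and backward character-LSTM states, so any string, in-vocabulary or not, gets a representation. The toy character hashing and all sizes are assumptions.

import torch
import torch.nn as nn

class C2W(nn.Module):
    """Character-to-word composition with a bidirectional LSTM."""
    def __init__(self, n_chars=256, char_dim=32, hidden=64, word_dim=128):
        super().__init__()
        self.emb = nn.Embedding(n_chars, char_dim)
        self.lstm = nn.LSTM(char_dim, hidden, bidirectional=True,
                            batch_first=True)
        self.proj_f = nn.Linear(hidden, word_dim, bias=True)
        self.proj_b = nn.Linear(hidden, word_dim, bias=False)

    def forward(self, word):              # word: a plain string
        # toy character hashing, standing in for a real character vocabulary
        ids = torch.tensor([[ord(c) % self.emb.num_embeddings
                             for c in word]])
        _, (h, _) = self.lstm(self.emb(ids))
        # combine the two directions with separate learned projections
        return self.proj_f(h[0]) + self.proj_b(h[1])   # (1, word_dim)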