EMNLP 2016

Sequence-to-Sequence Learning as Beam-Search Optimization

EMNLP 2016 pytorch/fairseq

In this work, we introduce a model and beam-search training scheme, based on the work of Daumé III and Marcu (2005), that extends seq2seq to learn global sequence scores.
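
The paper's contribution is the training scheme; for reference, the sketch below shows only plain beam search over such global sequence scores, not the optimization procedure itself. The scoring function score_step and the flat vocabulary loop are illustrative assumptions.

    import heapq

    def beam_search(score_step, vocab, eos, beam_size=5, max_len=20):
        # Each hypothesis is (cumulative_score, token_sequence).
        beam = [(0.0, [])]
        for _ in range(max_len):
            candidates = []
            for score, seq in beam:
                if seq and seq[-1] == eos:        # finished hypotheses carry over
                    candidates.append((score, seq))
                    continue
                for tok in vocab:
                    # score_step is a hypothetical per-extension scorer.
                    candidates.append((score + score_step(seq, tok), seq + [tok]))
            # Keep the top-k extensions by total (global) sequence score.
            beam = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
        return max(beam, key=lambda c: c[0])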

LANGUAGE MODELLING MACHINE TRANSLATION TEXT GENERATION

Key-Value Memory Networks for Directly Reading Documents

EMNLP 2016 facebookresearch/ParlAI

Directly reading documents and being able to answer questions from them is an unsolved challenge.
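
The snippet above does not describe the mechanism, but the title's key-value memory read is well known: attend over keys, then return a weighted sum of the values. Below is a minimal single-hop sketch; the shapes and dot-product scoring are assumptions, not the paper's exact architecture.

    import torch
    import torch.nn.functional as F

    def kv_memory_read(query, keys, values):
        # query: (d,), keys/values: (num_slots, d)
        attn = F.softmax(keys @ query, dim=0)   # one attention weight per memory slot
        return values.t() @ attn                # weighted sum of value embeddings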

QUESTION ANSWERING

Sequence-Level Knowledge Distillation

EMNLP 2016 harvardnlp/seq2seq-attn

We demonstrate that standard knowledge distillation applied to word-level prediction can be effective for NMT, and also introduce two novel sequence-level versions of knowledge distillation that further improve performance and, somewhat surprisingly, seem to eliminate the need for beam search (even when applied on the original teacher model).
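
A minimal sketch of the word-level distillation baseline the abstract builds on: train the student against the teacher's per-token distribution (sequence-level KD instead trains on the teacher's beam-search output). Tensor shapes and the temperature parameter are illustrative assumptions.

    import torch.nn.functional as F

    def word_kd_loss(student_logits, teacher_logits, T=1.0):
        # logits: (batch, seq_len, vocab)
        teacher_probs = F.softmax(teacher_logits / T, dim=-1)
        student_logp = F.log_softmax(student_logits / T, dim=-1)
        # Cross-entropy of the student against the teacher's soft targets.
        return -(teacher_probs * student_logp).sum(dim=-1).mean()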

MACHINE TRANSLATION

Aspect Level Sentiment Classification with Deep Memory Network

EMNLP 2016 songyouwei/ABSA-PyTorch

The importance degree of each context word and the text representation are calculated with multiple computational layers, each of which is a neural attention model over an external memory.
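
A minimal sketch of one such layer ("hop"): the context words act as external memory, and attention weights give the importance degree per word. The bilinear scoring form and the residual aspect connection are assumptions for illustration.

    import torch
    import torch.nn.functional as F

    def attention_hop(memory, aspect, W):
        # memory: (n_words, d) context word embeddings; aspect: (d,)
        scores = torch.tanh(memory @ W @ aspect)   # importance degree per word
        weights = F.softmax(scores, dim=0)
        summary = memory.t() @ weights             # attended text representation
        return summary + aspect                    # input to the next hop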

ASPECT-BASED SENTIMENT ANALYSIS

SQuAD: 100,000+ Questions for Machine Comprehension of Text

EMNLP 2016 HKUST-KnowComp/R-Net

We present the Stanford Question Answering Dataset (SQuAD), a new reading comprehension dataset consisting of 100,000+ questions posed by crowdworkers on a set of Wikipedia articles, where the answer to each question is a segment of text from the corresponding reading passage.
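
A minimal sketch of reading the released SQuAD v1.1 JSON and verifying the property the abstract states: every answer is literally a span of its passage. The file name matches the official release; error handling is omitted.

    import json

    with open("train-v1.1.json") as f:
        squad = json.load(f)

    for article in squad["data"]:
        for paragraph in article["paragraphs"]:
            context = paragraph["context"]
            for qa in paragraph["qas"]:
                for ans in qa["answers"]:
                    start = ans["answer_start"]
                    span = context[start:start + len(ans["text"])]
                    assert span == ans["text"]   # answers are passage substrings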

QUESTION ANSWERING READING COMPREHENSION

Neural Generation of Regular Expressions from Natural Language with Minimal Domain Knowledge

EMNLP 2016 nicholaslocascio/deep-regex

This paper explores the task of translating natural language queries into regular expressions which embody their meaning.
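
A hypothetical input/output pair illustrating the task; it is not drawn from the paper's dataset, and the regex shown is an invented model output, not one the system produced.

    import re

    description = "lines containing the word 'dog'"
    predicted_regex = r".*\bdog\b.*"   # hypothetical model output

    assert re.fullmatch(predicted_regex, "the dog barks")
    assert not re.fullmatch(predicted_regex, "dogma is different")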