HLT 2016

A Diversity-Promoting Objective Function for Neural Conversation Models

HLT 2016 pender/chatbot-rnn

Sequence-to-sequence neural network models for generation of conversational responses tend to generate safe, commonplace responses (e.g., "I don't know") regardless of the input.
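The remedy this paper proposes is a Maximum Mutual Information objective, which penalizes responses that are likely regardless of the input. A minimal reranking sketch of that idea (toy scores, hypothetical helper name; real systems compute the log-probabilities with trained seq2seq and language models):

```python
def mmi_rerank(candidates, lam=0.5):
    """Rerank candidate responses by log p(T|S) - lam * log p(T).

    Each candidate is (response, logp_t_given_s, logp_t).
    The -lam * log p(T) term demotes generic, high-frequency replies.
    """
    return sorted(candidates,
                  key=lambda c: c[1] - lam * c[2],
                  reverse=True)

# Toy scores: the generic reply has high unconditional probability,
# so the anti-language-model penalty demotes it.
cands = [
    ("i don't know", -2.0, -1.0),
    ("the meeting is at noon", -2.5, -6.0),
]
best = mmi_rerank(cands)[0][0]  # "the meeting is at noon"
```

With `lam=0` the objective reduces to plain conditional likelihood and the safe reply wins again, which is exactly the behavior the paper argues against.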

Learning to Compose Neural Networks for Question Answering

HLT 2016 jacobandreas/nmn2

We describe a question answering model that applies to both images and structured knowledge bases.

QUESTION ANSWERING

Neural Architectures for Named Entity Recognition

HLT 2016 marekrei/sequence-labeler

State-of-the-art named entity recognition systems rely heavily on hand-crafted features and domain-specific knowledge in order to learn effectively from the small, supervised training corpora that are available.

NAMED ENTITY RECOGNITION

Recurrent Neural Network Grammars

HLT 2016 clab/rnng

We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure.

CONSTITUENCY PARSING, LANGUAGE MODELLING
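RNNGs generate a sentence and its phrase structure jointly through a sequence of transition actions: open a nonterminal, generate a word, or close (reduce) the most recent open nonterminal. A minimal, model-free sketch of how such an action sequence determines a tree (the neural scoring of actions is omitted):

```python
def apply_actions(actions):
    """Build a bracketed parse from RNNG-style generator actions:
    ('NT', X) opens nonterminal X, ('GEN', w) emits word w,
    'REDUCE' closes the most recently opened nonterminal."""
    stack = []
    for a in actions:
        if a == "REDUCE":
            # Pop completed children back to the matching open nonterminal.
            children = []
            while not (isinstance(stack[-1], tuple) and stack[-1][0] == "open"):
                children.append(stack.pop())
            label = stack.pop()[1]
            stack.append("(" + label + " " + " ".join(reversed(children)) + ")")
        elif a[0] == "NT":
            stack.append(("open", a[1]))
        else:  # ('GEN', word)
            stack.append(a[1])
    return stack[0]

acts = [("NT", "S"), ("NT", "NP"), ("GEN", "the"), ("GEN", "hungry"),
        ("GEN", "cat"), "REDUCE", ("GEN", "meows"), "REDUCE"]
tree = apply_actions(acts)  # "(S (NP the hungry cat) meows)"
```

In the actual model, an RNN over the stack, the action history, and the generated words scores each next action, so the same machinery defines both a parser and a language model.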

Learning Global Features for Coreference Resolution

HLT 2016 swiseman/nn_coref

There is compelling evidence that coreference prediction would benefit from modeling global information about entity-clusters.

COREFERENCE RESOLUTION

Top-down Tree Long Short-Term Memory Networks

HLT 2016 XingxingZhang/td-treelstm

Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have been successfully applied to a variety of sequence modeling tasks.

DEPENDENCY PARSING

STransE: a novel embedding model of entities and relationships in knowledge bases

HLT 2016 datquocnguyen/STransE

Knowledge bases of real-world facts about entities and their relationships are useful resources for a variety of natural language processing tasks.

KNOWLEDGE BASE COMPLETION, LINK PREDICTION
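STransE scores a triple (h, r, t) by projecting the head and tail embeddings through two relation-specific matrices before applying a TransE-style translation check: lower values of ||W_r,1 h + v_r - W_r,2 t|| mean a more plausible triple. A small NumPy sketch of that scoring function (random toy embeddings, not trained parameters):

```python
import numpy as np

def stranse_score(h, r, t, W1, W2, p=1):
    """STransE implausibility score ||W1 @ h + r - W2 @ t||_p.

    h, r, t: head, relation, and tail vectors; W1, W2: the
    relation-specific projection matrices. Lower is more plausible.
    """
    return np.linalg.norm(W1 @ h + r - W2 @ t, ord=p)

rng = np.random.default_rng(0)
k = 4
h, r, t = rng.normal(size=(3, k))
W1 = np.eye(k)  # with identity matrices this reduces to TransE
W2 = np.eye(k)
score = stranse_score(h, r, t, W1, W2)
```

A triple that exactly satisfies the translation (t = h + r under identity projections) scores zero, which is the ideal case the model is trained toward with a margin-based ranking loss.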