HLT 2016

Learning to Compose Neural Networks for Question Answering

HLT 2016 jacobandreas/nmn2

We describe a question answering model that applies to both images and structured knowledge bases.

QUESTION ANSWERING
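
To make the composition idea concrete, below is a toy sketch assuming a tiny module inventory (`find`, `count`, `exists`) and a hand-written question-to-layout table; these names and the sigmoid attention are illustrative assumptions, since the paper predicts layouts from dependency parses and trains the modules jointly, over image features or knowledge-base tables alike.

```python
# Toy sketch of composing a question-specific network from reusable modules.
# Module set, sigmoid attention, and the fixed layout table are assumptions.
import numpy as np

def find(features, w):
    """Attention module: score each image region / KB cell in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(features @ w)))

def count(attention):
    """Measurement module: reduce an attention map to a count estimate."""
    return attention.sum()

def exists(attention):
    """Measurement module: reduce an attention map to a detection score."""
    return attention.max()

# A "layout" wires modules into a network chosen per question.
LAYOUTS = {
    "how many": lambda feats, w: count(find(feats, w)),
    "is there": lambda feats, w: exists(find(feats, w)),
}

features = np.random.randn(10, 4)   # 10 regions or KB cells, 4-dim features
w = np.random.randn(4)              # parameters of this `find` instance
print(LAYOUTS["how many"](features, w))
```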

Learning Natural Language Inference with LSTM

HLT 2016 shuohangwang/SeqMatchSeq

On the SNLI corpus, our model achieves an accuracy of 86.1%, outperforming the state of the art.

NATURAL LANGUAGE INFERENCE · SENTENCE EMBEDDINGS

Neural Architectures for Named Entity Recognition

HLT 2016 marekrei/sequence-labeler

State-of-the-art named entity recognition systems rely heavily on hand-crafted features and domain-specific knowledge in order to learn effectively from the small, supervised training corpora that are available.

NAMED ENTITY RECOGNITION (NER)

Recurrent Neural Network Grammars

HLT 2016 clab/rnng

We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure.

CONSTITUENCY PARSING · LANGUAGE MODELLING
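
The model generates a sentence and its tree jointly through a sequence of NT(X), SHIFT (GEN in the generative variant), and REDUCE transitions. Below is a minimal sketch of that transition system alone, executing a given action sequence; the stack-LSTM encodings and the learned distribution over actions, which make it a probabilistic model, are omitted.

```python
# Minimal sketch of the RNNG transition system: rebuilds a phrase-structure
# tree from NT(X), SHIFT, and REDUCE actions. Neural scoring is omitted.

def execute(actions, words):
    stack, buffer = [], list(words)
    for act in actions:
        if act.startswith("NT("):
            stack.append(("OPEN", act[3:-1]))   # open nonterminal marker
        elif act == "SHIFT":
            stack.append(buffer.pop(0))         # terminal: a word string
        elif act == "REDUCE":
            children = []                       # pop back to the open NT
            while not (isinstance(stack[-1], tuple) and stack[-1][0] == "OPEN"):
                children.append(stack.pop())
            label = stack.pop()[1]              # close it as a constituent
            stack.append((label, tuple(reversed(children))))
    assert len(stack) == 1 and not buffer
    return stack[0]

tree = execute(
    ["NT(S)", "NT(NP)", "SHIFT", "REDUCE", "NT(VP)", "SHIFT", "REDUCE", "REDUCE"],
    ["dogs", "bark"],
)
print(tree)   # ('S', (('NP', ('dogs',)), ('VP', ('bark',))))
```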

Learning Global Features for Coreference Resolution

HLT 2016 swiseman/nn_coref

There is compelling evidence that coreference prediction would benefit from modeling global information about entity-clusters.

COREFERENCE RESOLUTION
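
As a toy illustration of scoring against entity clusters rather than individual antecedent mentions, the sketch below links mentions greedily using a running-mean cluster representation and cosine scores; both are stand-in assumptions, as the paper learns cluster representations with RNNs over each cluster's mentions.

```python
# Toy cluster-level linking: each mention is compared to a global
# representation of every existing cluster, not to single mentions.
import numpy as np

def link_mentions(mentions, threshold=0.5):
    """Greedy left-to-right clustering with cluster-level scores."""
    clusters, assignments = [], []
    for m in mentions:
        best, best_score = None, threshold
        for c, members in enumerate(clusters):
            rep = np.mean(members, axis=0)      # running-mean cluster vector
            score = rep @ m / (np.linalg.norm(rep) * np.linalg.norm(m) + 1e-8)
            if score > best_score:
                best, best_score = c, score
        if best is None:                        # start a new entity
            clusters.append([m])
            assignments.append(len(clusters) - 1)
        else:                                   # merge into best cluster
            clusters[best].append(m)
            assignments.append(best)
    return assignments

vecs = np.random.randn(6, 8)                    # 6 mention embeddings
print(link_mentions(vecs))
```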

Top-down Tree Long Short-Term Memory Networks

HLT 2016 XingxingZhang/td-treelstm

Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have been successfully applied to a variety of sequence modeling tasks.

DEPENDENCY PARSING
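
For reference, the "more complex computational unit" is the gated cell below, shown as one plain sequential step in NumPy with placeholder random weights; the paper's contribution, a top-down tree-structured variant for dependency parsing, layers tree-shaped state flow on top of this standard cell.

```python
# One standard LSTM cell step in plain NumPy. Weights are random
# placeholders; gates decide what to forget, write, and expose.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    z = W @ x + U @ h_prev + b        # all four gate pre-activations at once
    d = h_prev.shape[0]
    i = sigmoid(z[:d])                # input gate
    f = sigmoid(z[d:2*d])             # forget gate
    o = sigmoid(z[2*d:3*d])           # output gate
    g = np.tanh(z[3*d:])              # candidate cell update
    c = f * c_prev + i * g            # new cell state
    h = o * np.tanh(c)                # new hidden state
    return h, c

d_in, d_h = 4, 8
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * d_h, d_in))
U = rng.normal(size=(4 * d_h, d_h))
b = np.zeros(4 * d_h)
h = c = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):  # run over a length-5 sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)                        # (8,)
```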

Counter-fitting Word Vectors to Linguistic Constraints

HLT 2016 nmrksic/counter-fitting

In this work, we present a novel counter-fitting method which injects antonymy and synonymy constraints into vector space representations in order to improve the vectors' capability for judging semantic similarity.

DIALOGUE STATE TRACKING · SEMANTIC TEXTUAL SIMILARITY
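
A simplified sketch of the method's three ingredients follows: synonym pairs are pulled together, antonym pairs are pushed apart, and a regularizer keeps every vector near its original position so the rest of the space is preserved. The plain SGD updates, Euclidean margins, and hyperparameter values are simplifying assumptions in place of the paper's exact margin-based objective.

```python
# Simplified counter-fitting: synonym attract + antonym repel + vector
# space preservation, optimized with plain SGD (an assumption; the paper
# minimizes a closely related margin-based objective).
import numpy as np

def counter_fit(vectors, synonyms, antonyms, steps=20,
                lr=0.1, syn_margin=0.0, ant_margin=1.0, reg=0.1):
    original = {w: v.copy() for w, v in vectors.items()}
    for _ in range(steps):
        for u, w in synonyms:                   # synonym attract
            diff = vectors[u] - vectors[w]
            if np.linalg.norm(diff) > syn_margin:
                vectors[u] -= lr * diff
                vectors[w] += lr * diff
        for u, w in antonyms:                   # antonym repel
            diff = vectors[u] - vectors[w]
            if np.linalg.norm(diff) < ant_margin:
                vectors[u] += lr * diff
                vectors[w] -= lr * diff
        for word in vectors:                    # vector space preservation
            vectors[word] -= lr * reg * (vectors[word] - original[word])
    return vectors

vecs = {w: np.random.randn(5) for w in ["cheap", "expensive", "pricey"]}
vecs = counter_fit(vecs, synonyms=[("expensive", "pricey")],
                   antonyms=[("cheap", "expensive")])
print(np.linalg.norm(vecs["expensive"] - vecs["pricey"]))   # pulled together
print(np.linalg.norm(vecs["cheap"] - vecs["expensive"]))    # pushed apart
```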