NAACL 2016

Neural Architectures for Named Entity Recognition

NAACL 2016 zalandoresearch/flair

State-of-the-art named entity recognition systems rely heavily on hand-crafted features and domain-specific knowledge in order to learn effectively from the small, supervised training corpora that are available.

NAMED ENTITY RECOGNITION
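
The architectures in this paper pair bidirectional LSTMs with a CRF output layer, whose decoding step is a Viterbi search over per-token emission scores and tag-transition scores. A minimal sketch of that decoding step, with toy scores standing in for a trained model's outputs:

```python
def viterbi_decode(emissions, transitions):
    """Viterbi decoding as used by the CRF layer of an LSTM-CRF tagger.
    emissions: list of {tag: score} dicts, one per token (toy scores here;
    a trained BiLSTM would supply these).
    transitions: {(prev_tag, tag): score}.
    Returns the highest-scoring tag sequence."""
    tags = list(emissions[0])
    # best path score ending in each tag at the current position
    score = {t: emissions[0][t] for t in tags}
    backptrs = []
    for em in emissions[1:]:
        new_score, ptrs = {}, {}
        for t in tags:
            best_prev = max(tags, key=lambda p: score[p] + transitions[(p, t)])
            new_score[t] = score[best_prev] + transitions[(best_prev, t)] + em[t]
            ptrs[t] = best_prev
        score = new_score
        backptrs.append(ptrs)
    # follow backpointers from the best final tag
    best = max(tags, key=score.get)
    path = [best]
    for ptrs in reversed(backptrs):
        path.append(ptrs[path[-1]])
    return list(reversed(path))
```

The transition scores are what let the CRF layer enforce tagging constraints (e.g. penalizing an I-PER that does not follow B-PER) that per-token softmax classification cannot.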

A Diversity-Promoting Objective Function for Neural Conversation Models

NAACL 2016 pender/chatbot-rnn

Sequence-to-sequence neural network models for generation of conversational responses tend to generate safe, commonplace responses (e.g., "I don't know") regardless of the input.
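
The paper's remedy is a Maximum Mutual Information objective; in its reranking form, candidate responses are scored by log p(T|S) − λ log p(T), so that responses which are generically probable under a language model are penalized. A toy sketch with made-up log-probabilities:

```python
def mmi_rerank(candidates, lam=0.5):
    """Rerank candidate responses by the MMI-style score
    log p(T|S) - lambda * log p(T), which demotes responses that are
    likely regardless of the source utterance S.
    candidates: list of (response, log_p_t_given_s, log_p_t) tuples
    (toy values here; a seq2seq model and a language model would
    supply the two log-probabilities)."""
    scored = [(resp, lp_ts - lam * lp_t) for resp, lp_ts, lp_t in candidates]
    return sorted(scored, key=lambda x: x[1], reverse=True)
```

A bland response like "I don't know" has high language-model probability, so subtracting λ log p(T) pushes it down the ranking even when its conditional probability is high.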

Learning to Compose Neural Networks for Question Answering

NAACL 2016 jacobandreas/nmn2

We describe a question answering model that applies to both images and structured knowledge bases.

QUESTION ANSWERING

Learning Natural Language Inference with LSTM

NAACL 2016 shuohangwang/SeqMatchSeq

On the SNLI corpus, our model achieves an accuracy of 86.1%, outperforming the state of the art.

NATURAL LANGUAGE INFERENCE · SENTENCE EMBEDDINGS

Recurrent Neural Network Grammars

NAACL 2016 clab/rnng

We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure.

CONSTITUENCY PARSING · LANGUAGE MODELLING

Learning Global Features for Coreference Resolution

NAACL 2016 swiseman/nn_coref

There is compelling evidence that coreference prediction would benefit from modeling global information about entity clusters.

COREFERENCE RESOLUTION
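
The contrast here is with mention-pair models, which score pairs of mentions in isolation; a cluster-level model can also condition on properties of the entity cluster built so far. A sketch of what such global features might look like — the feature choices below are illustrative, not the paper's actual feature set:

```python
def cluster_features(cluster):
    """Toy cluster-level (global) features for a partially built entity
    cluster, given as a list of mention head words. A mention-pair model
    sees only two mentions at a time; a cluster-level scorer can use
    properties of the whole cluster like these.
    (Illustrative features only, not the paper's.)"""
    heads = [m.lower() for m in cluster]
    return {
        "size": len(heads),                       # how many mentions so far
        "all_heads_match": len(set(heads)) == 1,  # do all head words agree?
    }
```

For example, whether a candidate mention's head word matches *every* head in the cluster is a global signal that no single mention-pair decision can see.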

Counter-fitting Word Vectors to Linguistic Constraints

NAACL 2016 nmrksic/counter-fitting

In this work, we present a novel counter-fitting method which injects antonymy and synonymy constraints into vector space representations in order to improve the vectors' capability for judging semantic similarity.

DIALOGUE STATE TRACKING · SEMANTIC TEXTUAL SIMILARITY
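
Counter-fitting turns the pairwise constraints into iterative vector updates: synonym pairs are pulled together and antonym pairs pushed apart. A simplified pure-Python sketch of the synonym-attract and antonym-repel terms (the full method additionally includes a term preserving the original vector space; the margins, learning rate, and word vectors below are toy values):

```python
def counter_fit(vectors, synonyms, antonyms, steps=50, lr=0.1,
                syn_margin=0.1, ant_margin=2.0):
    """Simplified counter-fitting updates on {word: [floats]} vectors:
    pull each synonym pair until it is within syn_margin, push each
    antonym pair until it is beyond ant_margin (hinge-style, as in the
    paper's synonym-attract / antonym-repel terms; margin values are toy)."""
    vecs = {w: list(v) for w, v in vectors.items()}

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(vecs[a], vecs[b])) ** 0.5

    for _ in range(steps):
        for a, b in synonyms:
            if dist(a, b) > syn_margin:          # still too far apart: attract
                for i in range(len(vecs[a])):
                    d = vecs[a][i] - vecs[b][i]
                    vecs[a][i] -= lr * d
                    vecs[b][i] += lr * d
        for a, b in antonyms:
            if dist(a, b) < ant_margin:          # still too close: repel
                for i in range(len(vecs[a])):
                    d = vecs[a][i] - vecs[b][i]
                    vecs[a][i] += lr * d
                    vecs[b][i] -= lr * d
    return vecs
```

This is why counter-fitted vectors judge semantic *similarity* better than raw distributional vectors: words like "cheap" and "expensive" occur in nearly identical contexts and so start out close, and only an explicit antonymy constraint separates them.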

Top-down Tree Long Short-Term Memory Networks

NAACL 2016 XingxingZhang/td-treelstm

Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have been successfully applied to a variety of sequence modeling tasks.

DEPENDENCY PARSING