TACL 2016

Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies

TACL 2016 ketranm/fan_vs_rnn

The success of long short-term memory (LSTM) neural networks in language processing is typically attributed to their ability to capture long-distance statistical regularities.

LANGUAGE MODELLING
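
To illustrate the kind of probe this paper runs, here is a minimal sketch (not the authors' code; toy vocabulary, sizes, and an untrained model, all assumptions for illustration) of an LSTM that reads a sentence prefix up to the verb and predicts whether that verb should be singular or plural:

```python
import torch
import torch.nn as nn

class AgreementProbe(nn.Module):
    """LSTM classifier for the number-agreement prediction task."""
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, 2)  # singular vs. plural

    def forward(self, token_ids):
        h, _ = self.lstm(self.embed(token_ids))
        return self.out(h[:, -1])  # classify from the state at the verb position

# Toy usage: "the keys to the cabinet ..." should take a plural verb, even
# though the intervening noun "cabinet" is singular (a long-distance dependency).
vocab = {w: i for i, w in enumerate(["the", "keys", "to", "cabinet"])}
prefix = torch.tensor([[vocab[w] for w in "the keys to the cabinet".split()]])
model = AgreementProbe(len(vocab))
print(model(prefix).softmax(-1))  # shape (1, 2): [singular, plural] probabilities
```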

Learning to Understand Phrases by Embedding the Dictionary

TACL 2016 northanapon/dict-definition

Distributional models that learn rich semantic word representations are a success story of recent NLP research.
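
A minimal sketch of the reverse-dictionary setup the paper studies: embed a definition, then rank candidate headwords by cosine similarity to that embedding. The paper trains LSTM and bag-of-words encoders against pretrained word vectors; the random vectors and mean-pooling encoder below are toy stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["feline", "dog", "house", "small", "domesticated", "animal", "cat"]
word_vecs = {w: rng.normal(size=50) for w in vocab}  # stand-in for pretrained vectors

def encode_definition(definition):
    """Bag-of-words encoder: mean of the definition's word vectors."""
    toks = [t for t in definition.lower().split() if t in word_vecs]
    return np.mean([word_vecs[t] for t in toks], axis=0)

def reverse_lookup(definition, candidates, k=3):
    """Rank candidate headwords by cosine similarity to the definition embedding."""
    q = encode_definition(definition)
    cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return sorted(candidates, key=lambda w: -cos(q, word_vecs[w]))[:k]

# With trained vectors, "cat" should rank first; random vectors give arbitrary output.
print(reverse_lookup("small domesticated feline animal", vocab))
```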

Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations

TACL 2016 ITUnlp/UniParse

The BiLSTM is trained jointly with the parser objective, resulting in very effective feature extractors for parsing.

DEPENDENCY PARSING
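
A minimal sketch (toy sizes, not the UniParse implementation) of the core idea: run a BiLSTM over the sentence, then score a candidate (head, modifier) arc with an MLP over the two words' BiLSTM states. In the paper these features are trained jointly with the parser objective; here the model is untrained:

```python
import torch
import torch.nn as nn

class ArcScorer(nn.Module):
    """BiLSTM feature extractor plus an MLP arc scorer."""
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.mlp = nn.Sequential(nn.Linear(4 * hidden_dim, hidden_dim),
                                 nn.Tanh(), nn.Linear(hidden_dim, 1))

    def forward(self, token_ids, head, modifier):
        feats, _ = self.bilstm(self.embed(token_ids))  # (1, n, 2 * hidden_dim)
        pair = torch.cat([feats[0, head], feats[0, modifier]], dim=-1)
        return self.mlp(pair)  # arc score, trained jointly with the parser loss

sent = torch.tensor([[1, 2, 3, 4]])     # toy token ids
model = ArcScorer(vocab_size=10)
print(model(sent, head=1, modifier=3))  # score for the arc 1 -> 3
```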

Fast, Small and Exact: Infinite-order Language Modelling with Compressed Suffix Trees

TACL 2016 eehsan/cstlm

Efficient methods for storing and querying are critical for scaling high-order n-gram language models to large corpora.

LANGUAGE MODELLING
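
A minimal sketch of the querying idea: rather than storing explicit n-gram tables, look up occurrence counts of arbitrary-length contexts directly in a suffix structure over the corpus. A plain suffix array stands in here for the paper's compressed suffix tree, and the probability is simple MLE rather than the modified Kneser-Ney smoothing the paper implements:

```python
from bisect import bisect_left, bisect_right  # key= requires Python 3.10+

corpus = "the cat sat on the mat the cat ate".split()
# Suffix array: corpus positions sorted by the suffix starting there.
suffixes = sorted(range(len(corpus)), key=lambda i: corpus[i:])

def count(pattern):
    """Occurrences of a token sequence, via binary search over the suffix array."""
    lo = bisect_left(suffixes, pattern, key=lambda i: corpus[i:i + len(pattern)])
    hi = bisect_right(suffixes, pattern, key=lambda i: corpus[i:i + len(pattern)])
    return hi - lo

def mle(word, context):
    """P(word | context) for a context of unbounded order, from raw counts."""
    c = count(context)
    return count(context + [word]) / c if c else 0.0

print(count(["the", "cat"]))        # 2
print(mle("sat", ["the", "cat"]))   # 0.5
```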

ABCNN: Attention-Based Convolutional Neural Network for Modeling Sentence Pairs

TACL 2016 kinimod23/ATS_Project

We propose three attention schemes that integrate mutual influence between sentences into CNNs; thus, the representation of each sentence takes its counterpart into consideration.

ANSWER SELECTION NATURAL LANGUAGE INFERENCE PARAPHRASE IDENTIFICATION
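
A minimal sketch of the ABCNN-style attention matrix between two sentences: each cell A[i, j] scores how well unit i of sentence 0 matches unit j of sentence 1, using the paper's 1 / (1 + Euclidean distance) match score. The row and column sums then reweight each sentence's features so its representation reflects its counterpart. The random embeddings and the simple reweighting below are toy assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
s0 = rng.normal(size=(5, 16))   # sentence 0: 5 tokens, 16-dim features
s1 = rng.normal(size=(7, 16))   # sentence 1: 7 tokens

# Attention matrix: A[i, j] = 1 / (1 + ||s0_i - s1_j||)
dists = np.linalg.norm(s0[:, None, :] - s1[None, :, :], axis=-1)
A = 1.0 / (1.0 + dists)

# Per-token attention weight: how strongly the counterpart sentence attends to it.
w0 = A.sum(axis=1)   # weights for sentence 0's tokens
w1 = A.sum(axis=0)   # weights for sentence 1's tokens

# Reweighted feature maps for the next convolution/pooling layer.
s0_att = s0 * w0[:, None]
s1_att = s1 * w1[:, None]
print(A.shape, s0_att.shape, s1_att.shape)  # (5, 7) (5, 16) (7, 16)
```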