TACL 2016

Named Entity Recognition with Bidirectional LSTM-CNNs

TACL 2016 zalandoresearch/flair

Named entity recognition is a challenging task that has traditionally required large amounts of knowledge in the form of feature engineering and lexicons to achieve high performance.

ENTITY LINKING FEATURE ENGINEERING NAMED ENTITY RECOGNITION (NER) WORD EMBEDDINGS

Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies

TACL 2016 yoavg/bert-syntax

The success of long short-term memory (LSTM) neural networks in language processing is typically attributed to their ability to capture long-distance statistical regularities.

LANGUAGE MODELLING

ABCNN: Attention-Based Convolutional Neural Network for Modeling Sentence Pairs

TACL 2016 Leputa/CIKM-AnalytiCup-2018

We propose three attention schemes that integrate mutual influence between sentences into the CNN, so that the representation of each sentence takes its counterpart into consideration.

ANSWER SELECTION NATURAL LANGUAGE INFERENCE PARAPHRASE IDENTIFICATION
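The mutual-influence idea can be illustrated with a minimal numpy sketch (an assumption-laden toy, not the paper's implementation): compute an attention matrix between the two sentences' feature maps with a similarity score (here `1 / (1 + ||x - y||)`, a common ABCNN-style choice), then re-represent each sentence as an attention-weighted mix of its counterpart's word features.

```python
import numpy as np

def attention_matrix(s1, s2):
    """Attention matrix A[i, j] between the word features of two sentences.

    Illustrative match score 1 / (1 + ||x - y||); s1: (n1, d), s2: (n2, d).
    """
    diff = s1[:, None, :] - s2[None, :, :]   # pairwise differences, (n1, n2, d)
    dist = np.linalg.norm(diff, axis=-1)     # Euclidean distances, (n1, n2)
    return 1.0 / (1.0 + dist)                # in (0, 1]; higher = more similar

def attend(s1, s2):
    """Re-represent each sentence using its counterpart, weighted by A."""
    A = attention_matrix(s1, s2)             # (n1, n2)
    # Each word in s1 becomes a normalized attention-weighted mix of s2's
    # words, and vice versa -- the "mutual influence" between the pair.
    s1_att = (A / A.sum(axis=1, keepdims=True)) @ s2
    s2_att = (A.T / A.T.sum(axis=1, keepdims=True)) @ s1
    return s1_att, s2_att

rng = np.random.default_rng(0)
s1 = rng.normal(size=(5, 8))   # sentence 1: 5 words, 8-dim features
s2 = rng.normal(size=(7, 8))   # sentence 2: 7 words, 8-dim features
s1_att, s2_att = attend(s1, s2)
print(s1_att.shape, s2_att.shape)  # (5, 8) (7, 8)
```

In the full model these attention-reweighted maps would feed additional convolution layers; here they only demonstrate how each sentence's representation comes to depend on the other.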

Learning to Understand Phrases by Embedding the Dictionary

TACL 2016 northanapon/dict-definition

Distributional models that learn rich semantic word representations are a success story of recent NLP research.

Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations

TACL 2016 ITUnlp/UniParse

The BiLSTM is trained jointly with the parser objective, resulting in very effective feature extractors for parsing.

DEPENDENCY PARSING
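The jointly-trained BiLSTM feature idea can be sketched as a toy numpy forward pass (all parameter shapes and names here are illustrative assumptions, not the paper's code): each word's feature vector is the concatenation of a forward and a backward LSTM state, and a small MLP scores candidate head-modifier arcs over those features.

```python
import numpy as np

def lstm_step(x, h, c, W):
    """One LSTM step; W packs the four gate transforms (illustrative layout)."""
    z = W @ np.concatenate([x, h])
    i, f, g, o = np.split(z, 4)
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    i, f, o = sig(i), sig(f), sig(o)
    c = f * c + i * np.tanh(g)               # update cell state
    return o * np.tanh(c), c                 # new hidden and cell states

def bilstm(X, Wf, Wb, hdim):
    """Per-word features: forward and backward states concatenated, (n, 2*hdim)."""
    n = len(X)
    hf, hb = np.zeros((n, hdim)), np.zeros((n, hdim))
    h, c = np.zeros(hdim), np.zeros(hdim)
    for t in range(n):                       # left-to-right pass
        h, c = lstm_step(X[t], h, c, Wf); hf[t] = h
    h, c = np.zeros(hdim), np.zeros(hdim)
    for t in reversed(range(n)):             # right-to-left pass
        h, c = lstm_step(X[t], h, c, Wb); hb[t] = h
    return np.concatenate([hf, hb], axis=1)

def arc_scores(V, U, v):
    """Score every (head, modifier) pair with a one-layer MLP over BiLSTM features."""
    n = len(V)
    S = np.zeros((n, n))
    for h in range(n):
        for m in range(n):
            S[h, m] = v @ np.tanh(U @ np.concatenate([V[h], V[m]]))
    return S

rng = np.random.default_rng(0)
xdim, hdim, n = 6, 4, 5
X = rng.normal(size=(n, xdim))                   # toy word embeddings
Wf = rng.normal(size=(4 * hdim, xdim + hdim)) * 0.1   # forward LSTM params
Wb = rng.normal(size=(4 * hdim, xdim + hdim)) * 0.1   # backward LSTM params
V = bilstm(X, Wf, Wb, hdim)                      # (5, 8) features per word
U = rng.normal(size=(hdim, 4 * hdim)) * 0.1      # MLP hidden layer
v = rng.normal(size=(hdim,))                     # MLP output weights
S = arc_scores(V, U, v)                          # (5, 5) head->modifier scores
heads = S.argmax(axis=0)                         # greedy head per word (toy decode)
print(V.shape, S.shape)
```

In the actual parser the LSTM and scoring parameters are learned jointly from the parsing loss (which is the point of the abstract), and decoding uses a proper algorithm rather than this greedy argmax; the sketch only shows how BiLSTM states serve as per-word parsing features.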