Globally Normalized Transition-Based Neural Networks

ACL 2016 tensorflow/models

Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models.

DEPENDENCY PARSING PART-OF-SPEECH TAGGING SENTENCE COMPRESSION
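
The transition-system framing is easiest to see in code. Below is a minimal arc-standard dependency-parsing transition system in Python, a sketch for illustration only: the paper's model layers a feed-forward scorer, beam search, and global normalization over a (more elaborate) transition system, none of which is shown here, and the class name is illustrative.

```python
# Minimal arc-standard transition system for dependency parsing.
# The paper's SyntaxNet model adds a feed-forward scorer, beam search,
# and global normalization on top of a transition system like this.

class ParserState:
    def __init__(self, words):
        self.stack = []                        # partially processed word indices
        self.buffer = list(range(len(words)))  # word indices not yet processed
        self.arcs = []                         # (head, dependent) pairs

    def shift(self):
        self.stack.append(self.buffer.pop(0))

    def left_arc(self):
        # second-from-top of the stack becomes a dependent of the top
        dep = self.stack.pop(-2)
        self.arcs.append((self.stack[-1], dep))

    def right_arc(self):
        # top of the stack becomes a dependent of the second-from-top
        dep = self.stack.pop()
        self.arcs.append((self.stack[-1], dep))

    def is_terminal(self):
        return not self.buffer and len(self.stack) == 1

# Parse "economic news had little effect" with a gold action sequence.
state = ParserState("economic news had little effect".split())
for action in ["shift", "shift", "left_arc", "shift", "left_arc",
               "shift", "shift", "left_arc", "right_arc"]:
    getattr(state, action)()
print(state.arcs)  # [(1, 0), (2, 1), (4, 3), (2, 4)]: (head, dependent) pairs
```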

Neural Machine Translation of Rare Words with Subword Units

ACL 2016 facebookresearch/fairseq-py

Neural machine translation (NMT) models typically operate with a fixed vocabulary, but translation is an open-vocabulary problem.

TRANSLITERATION
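
The paper's subword-unit method adapts byte-pair encoding (BPE): repeatedly merge the most frequent pair of adjacent symbols, so common words stay intact while rare words decompose into smaller units. Below is a minimal sketch of the merge-learning loop in that spirit; the toy vocabulary and merge count are illustrative, not the paper's settings.

```python
import re, collections

# Minimal byte-pair-encoding (BPE) merge learner. Words are sequences
# of characters plus an end-of-word marker; each step merges the most
# frequent pair of adjacent symbols into a new symbol.

def get_pair_stats(vocab):
    pairs = collections.Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for pair in zip(symbols, symbols[1:]):
            pairs[pair] += freq
    return pairs

def merge_pair(pair, vocab):
    # Merge only where the pair occurs as two whole, adjacent symbols.
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq
            for word, freq in vocab.items()}

vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}

for _ in range(10):  # the number of merges is a hyperparameter
    stats = get_pair_stats(vocab)
    if not stats:
        break
    best = max(stats, key=stats.get)
    vocab = merge_pair(best, vocab)
    print("merged:", best)  # first merges: ('e','s'), ('es','t'), ...
```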

End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF

ACL 2016 guillaumegenthial/sequence_tagging

State-of-the-art sequence labeling systems traditionally require large amounts of task-specific knowledge in the form of hand-crafted features and data pre-processing.

FEATURE ENGINEERING NAMED ENTITY RECOGNITION (NER) PART-OF-SPEECH TAGGING
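
The architecture named in the title can be sketched compactly: a character-level CNN builds per-word representations, a bidirectional LSTM contextualizes them, and a CRF decodes the tag sequence jointly. The PyTorch sketch below shows the encoder and the per-tag emission scores; CRF training and Viterbi decoding are omitted, and the class name and all dimensions are illustrative rather than the paper's settings.

```python
import torch
import torch.nn as nn

# Schematic BiLSTM-CNN(-CRF) tagger: char CNN -> word representation,
# BiLSTM -> contextual states, linear layer -> per-tag emission scores
# that a CRF layer would decode jointly.

class BiLSTMCNNTagger(nn.Module):
    def __init__(self, n_chars, n_words, n_tags,
                 char_dim=30, word_dim=100, hidden=200):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.char_cnn = nn.Conv1d(char_dim, char_dim, kernel_size=3, padding=1)
        self.word_emb = nn.Embedding(n_words, word_dim)
        self.lstm = nn.LSTM(word_dim + char_dim, hidden // 2,
                            bidirectional=True, batch_first=True)
        self.emit = nn.Linear(hidden, n_tags)
        # Where the CRF's tag-transition scores would live (unused here).
        self.transitions = nn.Parameter(torch.zeros(n_tags, n_tags))

    def forward(self, words, chars):
        # words: (batch, seq); chars: (batch, seq, max_word_len)
        b, s, l = chars.shape
        c = self.char_emb(chars).view(b * s, l, -1).transpose(1, 2)
        c = torch.relu(self.char_cnn(c)).max(dim=2).values.view(b, s, -1)
        x = torch.cat([self.word_emb(words), c], dim=-1)
        h, _ = self.lstm(x)
        return self.emit(h)  # per-position tag scores, the CRF's emissions

model = BiLSTMCNNTagger(n_chars=80, n_words=5000, n_tags=9)
scores = model(torch.randint(0, 5000, (2, 7)), torch.randint(0, 80, (2, 7, 12)))
print(scores.shape)  # torch.Size([2, 7, 9])
```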

Matrix Factorization using Window Sampling and Negative Sampling for Improved Word Representations

ACL 2016 alexandres/lexvec

In this paper, we propose LexVec, a new method for generating distributed word representations. LexVec performs low-rank, weighted factorization of the positive pointwise mutual information (PPMI) matrix via stochastic gradient descent, using a weighting scheme that penalizes errors on frequent co-occurrences more heavily while still accounting for negative co-occurrences.
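
A toy numpy sketch of that training scheme: build a PPMI matrix from a tiny corpus, then fit word and context vectors by SGD on sampled window pairs plus random negative contexts, so frequent co-occurrences are visited, and hence penalized, more often. The corpus and all hyperparameters below are illustrative, not the paper's.

```python
import numpy as np

# Toy LexVec-style training: factorize a PPMI matrix by SGD over
# sampled window pairs plus random negative contexts.

rng = np.random.default_rng(0)
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, dim, window = len(vocab), 10, 2

# Symmetric-window co-occurrence counts.
C = np.zeros((V, V))
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            C[idx[w], idx[corpus[j]]] += 1

# PPMI(w, c) = max(0, log P(w, c) / (P(w) P(c))).
total = C.sum()
pw, pc = C.sum(axis=1) / total, C.sum(axis=0) / total
with np.errstate(divide="ignore"):
    pmi = np.log((C / total) / np.outer(pw, pc))
ppmi = np.where(np.isfinite(pmi) & (pmi > 0), pmi, 0.0)

W = rng.normal(0, 0.1, (V, dim))    # word vectors
Cv = rng.normal(0, 0.1, (V, dim))   # context vectors
lr, negatives = 0.05, 3

def sgd_step(w, c):
    # One squared-error step toward reconstructing ppmi[w, c].
    err = W[w] @ Cv[c] - ppmi[w, c]
    w_old = W[w].copy()
    W[w] -= lr * err * Cv[c]
    Cv[c] -= lr * err * w_old

for _ in range(5000):
    i = int(rng.integers(len(corpus) - 1))
    w, c = idx[corpus[i]], idx[corpus[i + 1]]   # sampled window pair
    sgd_step(w, c)
    for _ in range(negatives):                  # negative sampling
        sgd_step(w, int(rng.integers(V)))

print(float(np.abs(W @ Cv.T - ppmi).mean()))  # mean reconstruction error
```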

A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task

ACL 2016 danqi/rc-cnn-dailymail

Enabling a computer to understand a document so that it can answer comprehension questions is a central, yet unsolved goal of NLP.

READING COMPREHENSION
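
The task format is worth making concrete. CNN/Daily Mail examples are cloze-style: entities in each article are anonymized to markers such as @entity0, and the question is a bullet-point summary sentence with one entity blanked out as @placeholder. The snippet below uses made-up text to show the format, plus a deliberately naive most-frequent-entity guess as a stand-in for the far stronger systems the paper analyzes.

```python
from collections import Counter

# Made-up example in the entity-anonymized cloze format (not real data).
passage = ("@entity0 beat @entity1 in the final , with @entity2 "
           "scoring twice for @entity0 .")
question = "@placeholder scored twice in the final"
answer = "@entity2"

# Trivial baseline: guess the most frequent passage entity.
entities = [tok for tok in passage.split() if tok.startswith("@entity")]
prediction = Counter(entities).most_common(1)[0][0]
print(prediction == answer)  # False: frequency alone is not enough
```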

Tweet2Vec: Character-Based Distributed Representations for Social Media

ACL 2016 bdhingra/tweet2vec

Text from social media provides a set of challenges that can cause traditional NLP approaches to fail.
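
Tweet2Vec handles this noise by working at the character level. The PyTorch sketch below shows the general shape of such a model: a bidirectional GRU reads the tweet as a character sequence, and the pooled state is trained to predict the tweet's hashtags. The class name, pooling choice, dimensions, and vocabulary here are my simplifications, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

# Schematic character-level tweet encoder: BiGRU over characters,
# pooled into a tweet embedding that scores candidate hashtags.

class CharTweetEncoder(nn.Module):
    def __init__(self, n_chars=256, char_dim=50, hidden=128, n_hashtags=1000):
        super().__init__()
        self.emb = nn.Embedding(n_chars, char_dim)
        self.gru = nn.GRU(char_dim, hidden, bidirectional=True,
                          batch_first=True)
        self.out = nn.Linear(2 * hidden, n_hashtags)

    def forward(self, char_ids):
        h, _ = self.gru(self.emb(char_ids))
        tweet_vec = h.mean(dim=1)              # pooled tweet embedding
        return tweet_vec, self.out(tweet_vec)  # embedding + hashtag logits

enc = CharTweetEncoder()
ids = torch.tensor([[ord(c) for c in "robustness to #typos and sl4ng"]])
vec, logits = enc(ids)
print(vec.shape, logits.shape)  # torch.Size([1, 256]) torch.Size([1, 1000])
```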

Diachronic Word Embeddings Reveal Statistical Laws of Semantic Change

ACL 2016 williamleif/histwords

Understanding how words change their meanings over time is key to models of language and cultural evolution, but historical data on meaning is scarce, making theories hard to develop and test.

WORD EMBEDDINGS
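
A key mechanical step in this line of work is aligning embedding spaces trained on different time periods, which the HistWords approach does with an orthogonal Procrustes rotation. The numpy sketch below solves that alignment in closed form via SVD; the random matrices stand in for real per-decade embedding matrices over a shared vocabulary.

```python
import numpy as np

# Orthogonal Procrustes alignment: find the rotation R minimizing
# ||A R - B||_F, solved in closed form from the SVD of A^T B.

def procrustes_align(A, B):
    # A^T B = U S V^T  =>  R = U V^T
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

rng = np.random.default_rng(0)
B = rng.normal(size=(1000, 50))                      # "decade t+1" vectors
true_R = np.linalg.qr(rng.normal(size=(50, 50)))[0]  # hidden rotation
A = B @ true_R.T                                     # "decade t" vectors

R = procrustes_align(A, B)
print(np.allclose(A @ R, B))  # True: alignment recovers the rotation
```

After alignment, semantic change for a word can be measured as the cosine distance between its vector in one time slice and its aligned vector in the next.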