CONLL 2016

Joint Learning of the Embedding of Words and Entities for Named Entity Disambiguation

CONLL 2016 wikipedia2vec/wikipedia2vec

The KB graph model learns entity relatedness from the link structure of the KB, while the anchor context model uses KB anchors and their context words to align word and entity vectors, so that similar words and entities lie close together in a shared vector space.

ENTITY DISAMBIGUATION · ENTITY LINKING
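To make the two models concrete, here is a minimal sketch of the joint objective, under the assumption that words and entities share a single embedding matrix trained with skip-gram negative sampling (SGNS) over pairs from (a) the KB link graph and (b) anchor contexts. The vocabulary, pairs, and hyperparameters below are toy illustrations, not the repo's API or training code.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, LR, NEG = 50, 0.05, 2

# Words and entities live in one shared vocabulary / vector space.
vocab = ["the", "capital", "of", "france", "ENTITY/Paris", "ENTITY/France"]
idx = {t: i for i, t in enumerate(vocab)}
W_in = rng.normal(scale=0.1, size=(len(vocab), DIM))  # target vectors
W_out = np.zeros((len(vocab), DIM))                   # context vectors

# (a) KB graph model: pairs of entities linked in the KB.
link_pairs = [("ENTITY/Paris", "ENTITY/France")]
# (b) anchor context model: an anchor's entity paired with its context words.
anchor_pairs = [("ENTITY/Paris", w) for w in ["the", "capital", "of", "france"]]

def sgns_step(target, context):
    """One SGNS update: push the positive pair together, negatives apart."""
    t = idx[target]
    samples = [(idx[context], 1.0)]  # positive pair
    samples += [(int(n), 0.0) for n in rng.integers(0, len(vocab), NEG)]  # negatives
    grad_t = np.zeros(DIM)
    for j, label in samples:
        score = 1.0 / (1.0 + np.exp(-W_in[t] @ W_out[j]))  # sigmoid of dot product
        g = score - label
        grad_t += g * W_out[j]
        W_out[j] -= LR * g * W_in[t]
    W_in[t] -= LR * grad_t

for _ in range(200):
    for pair in link_pairs + anchor_pairs:
        sgns_step(*pair)

# Because words and entities share the space, cross-type similarity is defined.
def unit(t):
    v = W_in[idx[t]]
    return v / np.linalg.norm(v)

print("sim(capital, ENTITY/Paris) =", float(unit("capital") @ unit("ENTITY/Paris")))
```

The actual wikipedia2vec/wikipedia2vec implementation additionally trains a standard word-word skip-gram model over the Wikipedia text, at full-corpus scale.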

Generating Sentences from a Continuous Space

CONLL 2016 timbmg/Sentence-VAE

The standard recurrent neural network language model (RNNLM) generates sentences one word at a time and does not work from an explicit global sentence representation.

LANGUAGE MODELLING
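The VAE alternative adds the missing global representation: an encoder RNN compresses the sentence into a latent code z, and the decoder RNN generates conditioned on z, regularized by a KL term toward a standard normal prior. The PyTorch sketch below is a minimal version of this idea; the layer sizes, the single-layer GRU, and the omission of KL annealing and word dropout are simplifications, not the repo's exact setup.

```python
import torch
import torch.nn as nn

class SentenceVAE(nn.Module):
    def __init__(self, vocab_size=1000, emb=64, hidden=128, latent=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.encoder = nn.GRU(emb, hidden, batch_first=True)
        self.to_mu = nn.Linear(hidden, latent)      # q(z|x) mean
        self.to_logvar = nn.Linear(hidden, latent)  # q(z|x) log variance
        self.z_to_h = nn.Linear(latent, hidden)     # global code seeds the decoder
        self.decoder = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tokens):
        x = self.embed(tokens)
        _, h = self.encoder(x)                      # h: (1, B, hidden)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        h0 = torch.tanh(self.z_to_h(z)).unsqueeze(0)
        dec, _ = self.decoder(x, h0)                # teacher forcing on same tokens
        logits = self.out(dec)
        # KL(q(z|x) || N(0, I)): the regularizer that makes z a smooth global code
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1).mean()
        return logits, kl

model = SentenceVAE()
tokens = torch.randint(0, 1000, (4, 12))            # toy batch: 4 sentences, 12 tokens
logits, kl = model(tokens)
recon = nn.functional.cross_entropy(logits[:, :-1].reshape(-1, 1000),
                                    tokens[:, 1:].reshape(-1))
loss = recon + kl                                   # ELBO (KL weighting omitted)
print(loss.item())
```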

Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond

CONLL 2016 theamrzaki/text_summurization_abstractive_methods

In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora.

ABSTRACTIVE TEXT SUMMARIZATION
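For orientation, here is a minimal PyTorch sketch of the attentional encoder-decoder backbone this line of work builds on. It uses plain dot-product attention with illustrative sizes and names; the paper goes beyond this baseline with feature-rich encoders, a switching generator-pointer for rare words, and hierarchical attention.

```python
import torch
import torch.nn as nn

class AttnSeq2Seq(nn.Module):
    def __init__(self, vocab=1000, emb=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.encoder = nn.GRU(emb, hidden, batch_first=True)
        self.decoder = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden * 2, vocab)  # decoder state + context vector

    def forward(self, src, tgt):
        enc, h = self.encoder(self.embed(src))        # enc: (B, S, H)
        dec, _ = self.decoder(self.embed(tgt), h)     # dec: (B, T, H)
        # Attention: each decoder step scores every encoder state,
        # then mixes them into a per-step summary of the source.
        scores = torch.bmm(dec, enc.transpose(1, 2))  # (B, T, S)
        attn = torch.softmax(scores, dim=-1)
        context = torch.bmm(attn, enc)                # (B, T, H)
        return self.out(torch.cat([dec, context], dim=-1))

model = AttnSeq2Seq()
src = torch.randint(0, 1000, (2, 30))   # toy article: 2 batches of 30 tokens
tgt = torch.randint(0, 1000, (2, 8))    # toy summary: 8 tokens
logits = model(src, tgt)                # (2, 8, 1000); train with cross-entropy
print(logits.shape)
```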

Greedy, Joint Syntactic-Semantic Parsing with Stack LSTMs

CONLL 2016 clab/joint-lstm-parser

We present a transition-based parser that jointly produces syntactic and semantic dependencies.

SEMANTIC PARSING
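The key idea is a single transition system whose actions incrementally build both structures at once. The sketch below only executes a hypothetical gold action sequence over a toy sentence: the action inventory and labels are simplified for illustration, and the paper's parser instead chooses each action greedily using stack LSTM representations of the stack, buffer, and action history.

```python
# Joint transition system sketch: one action sequence yields both a syntactic
# dependency tree and semantic (predicate-argument) arcs.

def parse(words, actions):
    stack, buffer = [], list(range(len(words)))
    syn_arcs, sem_arcs = [], []               # (head, dependent, label)
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))
        elif act.startswith("LEFT-ARC"):      # syntax: top heads second, pop second
            dep = stack.pop(-2)
            syn_arcs.append((stack[-1], dep, act.split(":")[1]))
        elif act.startswith("RIGHT-ARC"):     # syntax: second heads top, pop top
            dep = stack.pop()
            syn_arcs.append((stack[-1], dep, act.split(":")[1]))
        elif act.startswith("SEM-LEFT"):      # semantics: top predicates second
            sem_arcs.append((stack[-1], stack[-2], act.split(":")[1]))
        elif act.startswith("SEM-RIGHT"):     # semantics: second predicates top
            sem_arcs.append((stack[-2], stack[-1], act.split(":")[1]))
    return syn_arcs, sem_arcs

words = ["Ms.", "Haag", "plays", "Elianti"]
# Hypothetical gold action sequence, for illustration only:
actions = ["SHIFT", "SHIFT", "LEFT-ARC:nmod", "SHIFT", "SEM-LEFT:A0",
           "LEFT-ARC:subj", "SHIFT", "SEM-RIGHT:A1", "RIGHT-ARC:obj"]
syn, sem = parse(words, actions)
print("syntax:", [(words[h], words[d], l) for h, d, l in syn])
print("semantics:", [(words[p], words[a], r) for p, a, r in sem])
```

Because semantic actions leave the stack untouched, the same word can serve as an argument of one predicate and still receive a syntactic head later in the sequence.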