Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

IJCNLP 2015 tensorflow/fold

Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks.

SENTIMENT ANALYSIS
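
To make the Tree-LSTM unit concrete, here is a minimal NumPy sketch of a single Child-Sum Tree-LSTM node update following the paper's equations: the children's hidden states are summed for the input, output, and update gates, while each child gets its own forget gate. Weight shapes and initialization are illustrative, not the authors' settings.

```python
# Minimal sketch of the Child-Sum Tree-LSTM cell (Tai et al., 2015).
# Dimensions and initialization are illustrative.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ChildSumTreeLSTMCell:
    def __init__(self, x_dim, h_dim, rng=np.random.default_rng(0)):
        def mat(r, c): return rng.normal(0, 0.1, (r, c))
        # One (W, U, b) triple per gate: input, forget, output, update.
        self.W = {g: mat(h_dim, x_dim) for g in "ifou"}
        self.U = {g: mat(h_dim, h_dim) for g in "ifou"}
        self.b = {g: np.zeros(h_dim) for g in "ifou"}

    def step(self, x, child_h, child_c):
        """x: node input vector; child_h / child_c: lists of child states."""
        h_tilde = np.sum(child_h, axis=0) if child_h else np.zeros_like(self.b["i"])
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_tilde + self.b["i"])
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_tilde + self.b["o"])
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_tilde + self.b["u"])
        # One forget gate per child, so each subtree can be kept or dropped.
        f = [sigmoid(self.W["f"] @ x + self.U["f"] @ hk + self.b["f"])
             for hk in child_h]
        c = i * u + sum(fk * ck for fk, ck in zip(f, child_c))
        h = o * np.tanh(c)
        return h, c
```

The per-child forget gates are what let the unit emphasize or discard individual subtrees, e.g. a single sentiment-bearing clause.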

On Using Very Large Target Vocabulary for Neural Machine Translation

IJCNLP 2015 HIT-SCIR/ELMoForManyLangs

The models trained by the proposed approach are empirically found to outperform the baseline models with a small vocabulary as well as the LSTM-based neural machine translation models.

MACHINE TRANSLATION
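
The key idea in the paper is to keep training cost roughly constant in the vocabulary size by normalizing each softmax over a small subset V′ of the target vocabulary rather than over all of it. The sketch below builds V′ from the batch's gold words plus uniformly sampled negatives; the paper instead samples from a partition-based proposal distribution and corrects for it with importance weights, both omitted here for brevity.

```python
# Hedged sketch of softmax over a sampled vocabulary subset V'.
# Names and shapes are illustrative, not the paper's implementation.
import numpy as np

def sampled_softmax_nll(hidden, W_out, targets, n_neg, vocab_size,
                        rng=np.random.default_rng(0)):
    """hidden: (B, d) decoder states; W_out: (V, d) output embeddings;
    targets: (B,) gold word ids. Mean negative log-likelihood with the
    normalization restricted to targets plus n_neg sampled negatives."""
    negatives = rng.choice(vocab_size, size=n_neg, replace=False)
    subset = np.union1d(targets, negatives)          # V' for this batch (sorted)
    pos = np.searchsorted(subset, targets)           # rows of the gold words in V'
    logits = hidden @ W_out[subset].T                # (B, |V'|), never (B, V)
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_z = np.log(np.exp(logits).sum(axis=1))
    return np.mean(log_z - logits[np.arange(len(targets)), pos])
```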

Addressing the Rare Word Problem in Neural Machine Translation

IJCNLP 2015 atpaino/deep-text-corrector

Our experiments on the WMT14 English to French translation task show that this method provides a substantial improvement of up to 2.8 BLEU points over an equivalent NMT system that does not use this technique.

MACHINE TRANSLATION
WORD ALIGNMENT
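
The paper's fix is largely a post-processing step: the NMT system is trained to emit annotated unknown-word tokens that point at their aligned source positions, and each emitted unknown is later replaced by a dictionary translation of the aligned source word, or by that word itself when no entry exists. The sketch below assumes a positional annotation where a token like `<unk_d>` encodes a relative offset d to the aligned source word; the token format and the dictionary are illustrative stand-ins.

```python
# Hedged sketch of unknown-word replacement at post-processing time.
import re

def replace_unks(output_tokens, source_tokens, dictionary):
    result = []
    for i, tok in enumerate(output_tokens):
        m = re.fullmatch(r"<unk_(-?\d+)>", tok)
        if m:
            src_pos = i + int(m.group(1))        # aligned source index
            if 0 <= src_pos < len(source_tokens):
                src_word = source_tokens[src_pos]
                # dictionary translation, else identity copy
                result.append(dictionary.get(src_word, src_word))
                continue
        result.append(tok)
    return result

# e.g. replace_unks(["le", "<unk_0>", "est", "grand"],
#                   ["the", "portico", "is", "big"],
#                   {"portico": "portique"})
# -> ["le", "portique", "est", "grand"]
```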

Transition-Based Dependency Parsing with Stack Long Short-Term Memory

IJCNLP 2015 elikip/bist-parser

We propose a technique for learning representations of parser states in transition-based dependency parsers.

TRANSITION-BASED DEPENDENCY PARSING
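
The central device is an LSTM whose memory behaves like a stack: push runs the usual LSTM update from the current top, and pop rewinds to the previous state in O(1), so the summary vector always encodes exactly the current stack contents. A minimal sketch with illustrative shapes follows; the paper moves a stack pointer over retained states, whereas this version simply discards popped states, which is equivalent for a single parse.

```python
# Hedged sketch of a stack LSTM: an LSTM with push/pop over its history.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class StackLSTM:
    def __init__(self, x_dim, h_dim, rng=np.random.default_rng(0)):
        # Single matrix computing all four gates at once (i, f, o, g).
        self.W = rng.normal(0, 0.1, (4 * h_dim, x_dim + h_dim))
        self.b = np.zeros(4 * h_dim)
        self.h_dim = h_dim
        # The stack holds (h, c) pairs; the top is the current summary.
        self.states = [(np.zeros(h_dim), np.zeros(h_dim))]

    def push(self, x):
        h_prev, c_prev = self.states[-1]
        z = self.W @ np.concatenate([x, h_prev]) + self.b
        i, f, o = (sigmoid(z[k * self.h_dim:(k + 1) * self.h_dim])
                   for k in range(3))
        g = np.tanh(z[3 * self.h_dim:])
        c = f * c_prev + i * g
        self.states.append((o * np.tanh(c), c))

    def pop(self):
        # Rewinding the stack restores the previous encoding.
        assert len(self.states) > 1, "cannot pop the initial state"
        self.states.pop()

    def summary(self):
        return self.states[-1][0]   # h at the current stack top
```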

Neural Responding Machine for Short-Text Conversation

IJCNLP 2015 tuxchow/ecm

We propose Neural Responding Machine (NRM), a neural network-based response generator for Short-Text Conversation.

SHORT-TEXT CONVERSATION
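
NRM is an encoder-decoder: the post is encoded, and the decoder generates the response token by token, with the paper's local scheme re-weighting the encoder states by attention at every step (the hybrid variant concatenates this with a fixed global summary of the whole post). A rough NumPy sketch of that attention step, with an illustrative MLP scoring function that may differ from the model's exact parameterization:

```python
# Hedged sketch of the per-step attention in NRM's local scheme.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def local_context(dec_state, enc_states, W_a, U_a, v_a):
    """dec_state: (d,) decoder state; enc_states: (T, d') post encodings.
    Returns a (d',) context vector for the current decoding step."""
    # Score each post position against the current decoder state.
    scores = np.array([v_a @ np.tanh(W_a @ dec_state + U_a @ h)
                       for h in enc_states])
    alpha = softmax(scores)                  # attention over the post
    return alpha @ enc_states                # weighted sum of encodings
```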