Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations

TACL 2016 · Eliyahu Kiperwasser, Yoav Goldberg

We present a simple and effective scheme for dependency parsing which is based on bidirectional-LSTMs (BiLSTMs). Each sentence token is associated with a BiLSTM vector representing the token in its sentential context, and feature vectors are constructed by concatenating a few BiLSTM vectors. The BiLSTM is trained jointly with the parser objective, resulting in very effective feature extractors for parsing. We demonstrate the effectiveness of the approach by applying it to a greedy transition-based parser as well as to a globally optimized graph-based parser. The resulting parsers have very simple architectures, and match or surpass the state-of-the-art accuracies on English and Chinese.
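Below is a minimal sketch of the feature scheme the abstract describes: a BiLSTM produces a contextual vector for each token, and a candidate head-modifier arc is scored by an MLP over the concatenation of the two tokens' BiLSTM vectors, in the spirit of the graph-based variant. This is not the authors' BIST implementation; the framework (PyTorch), layer sizes, and class name are illustrative assumptions.

```python
# Hedged sketch (not the authors' code): BiLSTM feature extractor + MLP arc scorer.
import torch
import torch.nn as nn

class BiLSTMArcScorer(nn.Module):
    def __init__(self, vocab_size, tag_size, word_dim=100, tag_dim=25,
                 lstm_dim=125, mlp_dim=100):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.tag_emb = nn.Embedding(tag_size, tag_dim)
        # BiLSTM over concatenated word+POS embeddings; each token i gets a
        # contextual vector v_i of size 2*lstm_dim.
        self.bilstm = nn.LSTM(word_dim + tag_dim, lstm_dim, num_layers=2,
                              bidirectional=True, batch_first=True)
        # MLP scoring a (head, modifier) pair from the concatenation [v_head ; v_mod].
        self.mlp = nn.Sequential(
            nn.Linear(4 * lstm_dim, mlp_dim),
            nn.Tanh(),
            nn.Linear(mlp_dim, 1),
        )

    def forward(self, words, tags):
        # words, tags: (batch, n) index tensors for a padded batch of sentences.
        x = torch.cat([self.word_emb(words), self.tag_emb(tags)], dim=-1)
        v, _ = self.bilstm(x)                          # (batch, n, 2*lstm_dim)
        n = v.size(1)
        heads = v.unsqueeze(2).expand(-1, n, n, -1)    # v_head broadcast over modifiers
        mods = v.unsqueeze(1).expand(-1, n, n, -1)     # v_mod broadcast over heads
        pair = torch.cat([heads, mods], dim=-1)        # feature vector [v_h ; v_m]
        return self.mlp(pair).squeeze(-1)              # (batch, n, n) arc scores
```

Because the BiLSTM parameters sit inside the scorer, training against the parser objective (e.g. a margin loss between the gold tree and the highest-scoring predicted tree) backpropagates into the feature extractor, which is the point the abstract emphasizes.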


Datasets

Penn Treebank (PTB) · Chinese Treebank (CTB)

Results
Task                       | Dataset          | Model                           | Metric | Value | Global Rank
Chinese Dependency Parsing | Chinese Treebank | Kiperwasser and Goldberg (2016) | LAS    | 86.1  | #3
Chinese Dependency Parsing | Chinese Treebank | Kiperwasser and Goldberg (2016) | UAS    | 87.6  | #4
Dependency Parsing         | Penn Treebank    | BIST transition-based parser    | POS    | 97.44 | #2
Dependency Parsing         | Penn Treebank    | BIST transition-based parser    | UAS    | 93.99 | #20
Dependency Parsing         | Penn Treebank    | BIST transition-based parser    | LAS    | 91.9  | #20
Dependency Parsing         | Penn Treebank    | BIST graph-based parser         | POS    | 97.3  | #5
Dependency Parsing         | Penn Treebank    | BIST graph-based parser         | UAS    | 93.1  | #22
Dependency Parsing         | Penn Treebank    | BIST graph-based parser         | LAS    | 91.0  | #22

Methods

LSTM · BiLSTM