We propose a technique for learning representations of parser states in transition-based dependency parsers.
We present a novel neural network model that learns POS tagging and graph-based dependency parsing jointly.
We first present a minimal feature set for transition-based dependency parsing, continuing a recent trend started by Kiperwasser and Goldberg (2016a) and Cross and Huang (2016a) of using bi-directional LSTM features.
Transition-based approaches based on local classification are attractive for dependency parsing due to their simplicity and speed, despite producing results slightly below the state-of-the-art.
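The transition-based, local-classification setup can be illustrated with an arc-standard system: the parser keeps a stack and a buffer, and at every state a classifier locally picks one of a few transitions (SHIFT, LEFT-ARC, RIGHT-ARC) until the sentence is consumed. This is a generic sketch of the paradigm, not any particular paper's system; the class and function names, and the toy rule-based policy standing in for a trained classifier, are assumptions.

```python
class ParserState:
    """Stack, buffer, and the set of dependency arcs built so far."""

    def __init__(self, words):
        self.stack = []                         # word indices
        self.buffer = list(range(len(words)))   # remaining word indices
        self.arcs = []                          # (head, dependent) pairs

    def shift(self):
        """Move the next buffer word onto the stack."""
        self.stack.append(self.buffer.pop(0))

    def left_arc(self):
        """Second-from-top of the stack becomes a dependent of the top."""
        dep = self.stack.pop(-2)
        self.arcs.append((self.stack[-1], dep))

    def right_arc(self):
        """Top of the stack becomes a dependent of the item below it."""
        dep = self.stack.pop()
        self.arcs.append((self.stack[-1], dep))

    def is_final(self):
        return not self.buffer and len(self.stack) == 1


def parse(words, classify):
    """Greedy parse: a local classifier picks one transition per state."""
    state = ParserState(words)
    while not state.is_final():
        action = classify(state)     # in practice, a neural net over state features
        getattr(state, action)()
    return state.arcs


# Toy deterministic "classifier" standing in for a trained model:
def toy_policy(state):
    return "left_arc" if len(state.stack) >= 2 else "shift"

arcs = parse(["He", "sleeps"], toy_policy)   # "He" attaches to "sleeps"
```

The speed the abstract mentions comes from this greedy loop: each sentence costs a linear number of transitions, each decided by one local classification, with no global search over parse trees.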