Second-Order Neural Dependency Parsing with Message Passing and End-to-End Training

10 Oct 2020 · Xinyu Wang, Kewei Tu

In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our approaches match the accuracy of very recent state-of-the-art second-order graph-based neural dependency parsers and are significantly faster in both training and testing.
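
To make the idea concrete, below is a minimal, self-contained illustration of the kind of mean-field variational inference (MFVI) loop that message-passing second-order parsers unroll: each dependent keeps a softmax distribution over candidate heads, and second-order scores (a sibling-style factor here) let arc beliefs inform one another across iterations. The function name, tensor layout, sibling-only factorization, and iteration count are assumptions made for this sketch, not the authors' implementation.

    import torch

    def mfvi_head_selection(s_arc, s_sib, iters=3):
        """Unrolled mean-field updates for head selection (illustrative sketch).

        s_arc: [n, n] first-order scores; s_arc[d, h] scores head h for dependent d.
        s_sib: [n, n, n] second-order scores; s_sib[d, h, k] scores dependents
               d and k sharing head h (this layout is an assumption).
        Returns q: [n, n], one distribution over heads per dependent.
        """
        n = s_arc.size(0)
        # A word cannot be its own sibling: zero out the d == k entries.
        self_pair = torch.eye(n, dtype=torch.bool).unsqueeze(1)
        s_sib = s_sib.masked_fill(self_pair, 0.0)

        q = torch.softmax(s_arc, dim=-1)  # initial beliefs from first-order scores
        for _ in range(iters):
            # Message into arc (d, h): expected sibling score over the other
            # dependents k, weighted by their current belief in head h.
            msg = torch.einsum('dhk,kh->dh', s_sib, q)
            q = torch.softmax(s_arc + msg, dim=-1)
        return q

    # Toy usage: random scores for a 5-token sentence.
    n = 5
    q = mfvi_head_selection(torch.randn(n, n), torch.randn(n, n, n))
    print(q.shape, q.sum(-1))  # torch.Size([5, 5]); each row sums to 1

Because every update is a differentiable softmax over score sums, the unrolled loop can sit on top of a neural scorer (for instance, biaffine scores over BERT features) and be trained end to end with ordinary backpropagation.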

Results from the Paper


TASK                 DATASET            MODEL   METRIC   VALUE (%)   GLOBAL RANK
Dependency Parsing   Chinese Treebank   MFVI    LAS      91.69       #1
Dependency Parsing   Chinese Treebank   MFVI    UAS      92.78       #1
Dependency Parsing   Penn Treebank      MFVI    UAS      96.91       #5
Dependency Parsing   Penn Treebank      MFVI    LAS      95.34       #5

Methods used in the Paper


METHOD                            TYPE
Adam                              Stochastic Optimization
Dense Connections                 Feedforward Networks
WordPiece                         Subword Segmentation
Multi-Head Attention              Attention Modules
Layer Normalization               Normalization
Linear Warmup With Linear Decay   Learning Rate Schedules
Attention Dropout                 Regularization
Weight Decay                      Regularization
Dropout                           Regularization
Scaled Dot-Product Attention      Attention Mechanisms
GELU                              Activation Functions
Residual Connection               Skip Connections
Softmax                           Output Functions
BERT                              Language Models