Linguistically-Informed Self-Attention for Semantic Role Labeling

EMNLP 2018 · Emma Strubell, Patrick Verga, Daniel Andor, David Weiss, Andrew McCallum

Current state-of-the-art semantic role labeling (SRL) uses a deep neural network with no explicit linguistic features. However, prior work has shown that gold syntax trees can dramatically improve SRL decoding, suggesting the possibility of increased accuracy from explicit modeling of syntax...
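The core idea behind LISA is to inject syntax into a self-attention model by training one attention head to attend to each token's syntactic head, so parsing supervision flows through the same attention mechanism used for SRL. Below is a minimal numpy sketch of that idea; the function name, weight shapes, and loss formulation are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def syntax_attention_head(X, Wq, Wk, gold_heads):
    """One self-attention head whose attention distribution is
    supervised to point at each token's syntactic head.

    X          : (n, d) token representations
    Wq, Wk     : (d, d) query/key projections
    gold_heads : (n,) gold head index for each token

    Returns the (n, n) attention matrix and an auxiliary parse loss
    (cross-entropy of each row against the gold head index).
    """
    Q, K = X @ Wq, X @ Wk
    scores = Q @ K.T / np.sqrt(K.shape[1])   # (n, n) attention logits
    attn = softmax(scores, axis=-1)          # row i: distribution over i's head
    n = X.shape[0]
    loss = -np.log(attn[np.arange(n), gold_heads] + 1e-12).mean()
    return attn, loss
```

At training time this parse loss would be added to the SRL objective; at test time the same head can be forced to attend to a predicted (or gold) parse, which is how the paper's syntax-injection experiments work conceptually.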

Task                                           Dataset     Model        Metric  Value  Rank
Predicate Detection                            CoNLL 2005  LISA         F1      98.4   #1
Semantic Role Labeling (predicted predicates)  CoNLL 2005  LISA + ELMo  F1      86.90  #1
Semantic Role Labeling                         CoNLL 2005  LISA         F1      86.04  #4
Semantic Role Labeling (predicted predicates)  CoNLL 2005  LISA         F1      84.99  #3
Semantic Role Labeling (predicted predicates)  CoNLL 2012  LISA         F1      82.33  #3
Semantic Role Labeling (predicted predicates)  CoNLL 2012  LISA + ELMo  F1      83.38  #1
Predicate Detection                            CoNLL 2012  LISA         F1      97.2   #1