AMR Parsing as Sequence-to-Graph Transduction

ACL 2019 · Sheng Zhang, Xutai Ma, Kevin Duh, Benjamin Van Durme

We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction. Unlike most AMR parsers that rely on pre-trained aligners, external semantic resources, or data augmentation, our proposed parser is aligner-free, and it can be effectively trained with limited amounts of labeled AMR data.
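The abstract describes the model only at a high level. As a rough illustration of the general idea (not the authors' exact architecture), a sequence-to-graph decoder can be sketched as a loop that emits one node at a time and immediately attaches it to the partial graph, which is what removes the need for a separate aligner. The `predict_node` and `predict_edge` callables below are hypothetical stand-ins for the learned attention-based components.

```python
from dataclasses import dataclass, field

@dataclass
class Graph:
    nodes: list = field(default_factory=list)   # concept labels, in generation order
    edges: list = field(default_factory=list)   # (head_idx, relation, dependent_idx)

def decode(sentence_tokens, predict_node, predict_edge, max_nodes=50):
    """Greedy sequence-to-graph decoding loop (illustrative sketch only).

    predict_node(tokens, graph) -> concept label, or None to stop
    predict_edge(tokens, graph, new_idx) -> (head_idx, relation), or None for the root
    """
    graph = Graph()
    for _ in range(max_nodes):
        label = predict_node(sentence_tokens, graph)
        if label is None:                       # model signals end of generation
            break
        graph.nodes.append(label)
        new_idx = len(graph.nodes) - 1
        edge = predict_edge(sentence_tokens, graph, new_idx)
        if edge is not None:                    # attach the new node to the partial graph
            head_idx, relation = edge
            graph.edges.append((head_idx, relation, new_idx))
    return graph
```

Because nodes are generated and attached directly from the input sequence, no token-to-node alignment has to be computed as a preprocessing step.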


Evaluation Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| AMR Parsing | LDC2014T12 | Sequence-to-Graph Transduction | F1 (Newswire) | 0.75 | #1 |
| AMR Parsing | LDC2014T12 | Sequence-to-Graph Transduction | F1 (Full) | 0.70 | #1 |
| AMR Parsing | LDC2017T10 | Sequence-to-Graph Transduction | F1 | 0.76 | #1 |
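The F1 values reported here are SMATCH scores, the standard AMR parsing metric. As a minimal sketch of how SMATCH-style F1 is derived from triple counts (the counts below are illustrative, not taken from the paper's evaluation):

```python
def smatch_f1(matched, predicted_total, gold_total):
    """SMATCH-style F1: precision over predicted triples, recall over gold triples."""
    precision = matched / predicted_total
    recall = matched / gold_total
    return 2 * precision * recall / (precision + recall)

# Illustrative triple counts only:
print(round(smatch_f1(matched=70, predicted_total=92, gold_total=100), 2))
```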