AMR Parsing as Sequence-to-Graph Transduction

ACL 2019 · Sheng Zhang, Xutai Ma, Kevin Duh, Benjamin Van Durme

We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction. Unlike most AMR parsers, which rely on pre-trained aligners, external semantic resources, or data augmentation, our proposed parser is aligner-free and can be effectively trained with limited amounts of labeled AMR data. Our parser outperforms all previously reported SMATCH scores on both AMR 2.0 (76.3% F1 on LDC2017T10) and AMR 1.0 (70.2% F1 on LDC2014T12).
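The paper's edge-prediction stage scores relations between predicted node states with deep biaffine attention. The following is a minimal NumPy sketch of biaffine pair scoring in that spirit; all names, dimensions, and the random inputs are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of biaffine edge scoring over node representations.
# For each ordered (head, dependent) pair of nodes, the score is
#   s(h, d) = h^T U d + w_h . h + w_d . d + b
# Shapes and parameter names here are illustrative, not from the paper.
import numpy as np

def biaffine_scores(heads, deps, U, w_h, w_d, b):
    """Return an (n, n) matrix of scores, one per ordered node pair."""
    scores = heads @ U @ deps.T            # bilinear term, shape (n, n)
    scores = scores + (heads @ w_h)[:, None]  # head-only bias term
    scores = scores + (deps @ w_d)[None, :]   # dependent-only bias term
    return scores + b

rng = np.random.default_rng(0)
n, d = 5, 8                                # 5 nodes, 8-dim node states
H = rng.normal(size=(n, d))                # node states as candidate heads
D = rng.normal(size=(n, d))                # node states as candidate dependents
S = biaffine_scores(H, D, rng.normal(size=(d, d)),
                    rng.normal(size=d), rng.normal(size=d), 0.1)
print(S.shape)                             # one score per ordered node pair
```

In the full model, a softmax over each row of such a score matrix would give a distribution over possible heads for each node, and a separate classifier would label the selected edges.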



Results from the Paper

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| AMR Parsing | LDC2014T12 | Two-stage Sequence-to-Graph Transducer | F1 (Full) | 70.2 | #5 |
| AMR Parsing | LDC2014T12 | Sequence-to-Graph Transduction | F1 (Newswire) | 0.75 | #1 |
| AMR Parsing | LDC2014T12 | Sequence-to-Graph Transduction | F1 (Full) | 0.70 | #1 |
| AMR Parsing | LDC2017T10 | Sequence-to-Graph Transduction | Smatch | 76.3 | #21 |
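The Smatch metric reported above is an F1 over matched (relation, source, target) triples between the predicted and gold AMR graphs. The sketch below illustrates only the F1 computation over already-aligned triples; the real Smatch tool additionally searches over variable mappings (e.g. via hill climbing) to maximize the match, and the example graphs are hypothetical.

```python
# Simplified illustration of a Smatch-style F1 over AMR triples,
# assuming variables are already aligned (real Smatch searches alignments).
def triple_f1(gold, predicted):
    matched = len(set(gold) & set(predicted))
    p = matched / len(predicted) if predicted else 0.0
    r = matched / len(gold) if gold else 0.0
    return 2 * p * r / (p + r) if p + r else 0.0

# Hypothetical graphs for "the boy wants ...": one wrong edge label.
gold = [("instance", "w", "want-01"), ("instance", "b", "boy"), ("ARG0", "w", "b")]
pred = [("instance", "w", "want-01"), ("instance", "b", "boy"), ("ARG1", "w", "b")]
print(round(triple_f1(gold, pred), 3))  # 2 of 3 triples match -> 0.667
```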

