ATP: AMRize Then Parse! Enhancing AMR Parsing with PseudoAMRs

As Abstract Meaning Representation (AMR) implicitly involves compound semantic annotations, we hypothesize that auxiliary tasks which are semantically or formally related to AMR can better enhance AMR parsing. We find that: 1) Semantic role labeling (SRL) and dependency parsing (DP) bring more performance gain than less related tasks, e.g., machine translation (MT) and summarization, in the text-to-AMR transition, even with much less data. 2) To better fit AMR, data from auxiliary tasks should be properly "AMRized" into PseudoAMRs before training; knowledge from shallow parsing tasks transfers better to AMR parsing after such structure transformation. 3) Intermediate-task learning is a better paradigm than multitask learning for introducing auxiliary tasks to AMR parsing. Building on these empirical findings, we propose a principled method for incorporating auxiliary tasks to boost AMR parsing. Extensive experiments show that our method achieves new state-of-the-art performance on different benchmarks, especially in topology-related scores.

Findings (NAACL) 2022 · PDF · Abstract
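
As a rough illustration of the "AMRize" step described in the abstract, the sketch below linearizes a toy SRL frame into a PENMAN-style PseudoAMR string. The helper name srl_to_pseudo_amr, the variable scheme, and the span-to-concept mapping are illustrative assumptions, not the paper's exact structure transform.

```python
# A toy sketch of "AMRizing" one semantic role labeling (SRL) frame into a
# PENMAN-style PseudoAMR string. Details here are assumptions for illustration,
# not the paper's exact transform.

def srl_to_pseudo_amr(predicate, arguments):
    """Linearize an SRL frame (predicate plus (role, span) pairs) as PseudoAMR."""
    parts = [f"(p / {predicate}"]
    for i, (role, span) in enumerate(arguments):
        # SRL roles such as ARG0/ARG1 map naturally onto AMR-style :ARGn edges;
        # each argument span becomes a child node with its own variable.
        parts.append(f" :{role} (a{i} / {span.replace(' ', '-')})")
    return "".join(parts) + ")"

print(srl_to_pseudo_amr("want-01", [("ARG0", "the boy"), ("ARG1", "go")]))
# -> (p / want-01 :ARG0 (a0 / the-boy) :ARG1 (a1 / go))
```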

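The second sketch contrasts the two training paradigms the abstract compares: intermediate-task learning (train on PseudoAMR first, then on gold AMR) versus multitask learning (train on both pools at once). The fine_tune helper is a hypothetical placeholder for an ordinary seq2seq fine-tuning run, not the paper's code.

```python
# A minimal sketch contrasting the two paradigms for a seq2seq parser
# (e.g., a pretrained BART-style checkpoint). fine_tune is hypothetical.

def fine_tune(model, dataset):
    """Placeholder for a standard seq2seq fine-tuning run on `dataset`."""
    return model  # training logic elided

def intermediate_task_learning(model, pseudo_amr_data, amr_data):
    # Two stages: first adapt on AMRized auxiliary data (SRL/DP turned
    # into PseudoAMR), then fine-tune on gold AMR annotations alone.
    model = fine_tune(model, pseudo_amr_data)
    return fine_tune(model, amr_data)

def multitask_learning(model, pseudo_amr_data, amr_data):
    # One stage: train on the mixed pool of auxiliary and gold AMR data.
    return fine_tune(model, pseudo_amr_data + amr_data)
```
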
Results from the Paper


Ranked #7 on AMR Parsing on LDC2020T02 (using extra training data)

Task         Dataset      Model                Metric   Value   Global Rank
AMR Parsing  LDC2017T10   ATP-SRL              Smatch   85.2    #8
AMR Parsing  LDC2017T10   ATP-SRL (Ensemble)   Smatch   85.3    #7
AMR Parsing  LDC2020T02   ATP-SRL              Smatch   84.0    #7
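
For reference, the Smatch scores above are F1 over matched graph triples between the predicted and gold AMRs under the best one-to-one variable mapping. The final score computation, assuming the triple counts have already been produced by a Smatch tool, is:

```python
# Smatch: F1 over matched triples under the best variable alignment.
# The counts passed in are assumed given; finding them is the hard part.

def smatch_f1(matched, num_pred_triples, num_gold_triples):
    precision = matched / num_pred_triples
    recall = matched / num_gold_triples
    return 2 * precision * recall / (precision + recall)

print(round(smatch_f1(80, 95, 93), 3))  # -> 0.851 on toy counts
```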

Methods


No methods listed for this paper.