Broad-Coverage Semantic Parsing as Transduction

We unify different broad-coverage semantic parsing tasks under a transduction paradigm, and propose an attention-based neural framework that incrementally builds a meaning representation via a sequence of semantic relations. By leveraging multiple attention mechanisms, the transducer can be trained effectively without relying on a pre-trained aligner. Experiments on three separate broad-coverage semantic parsing tasks -- Abstract Meaning Representation (AMR), Semantic Dependency Parsing (SDP), and Universal Conceptual Cognitive Annotation (UCCA) -- demonstrate that our attention-based neural transducer improves the state of the art on both AMR and UCCA, and is competitive with the state of the art on SDP.
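To make the transduction idea concrete, the following is a minimal sketch of one decoding step that emits a (head, relation) decision for a new graph node, combining source-side attention over encoder states (which stands in for an explicit aligner) with target-side attention over previously built nodes. All dimensions, parameter names, the linear relation scorer, and the greedy decision rule are illustrative assumptions, not the paper's exact architecture.

```python
# Illustrative sketch: one step of an attention-based transducer that
# builds a semantic graph as a sequence of (node, head, relation)
# decisions. Shapes and the scoring functions are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def attend(query, keys):
    """Dot-product attention: returns weights over keys and a context vector."""
    weights = softmax(keys @ query)
    return weights, weights @ keys

def transduce_step(dec_state, enc_states, node_states, relations):
    """One decoding step: attend to source tokens (soft alignment, so no
    pre-trained aligner is needed), attend to previously generated nodes,
    then pick a head node and a semantic relation for the new node."""
    src_w, src_ctx = attend(dec_state, enc_states)    # source-side attention
    tgt_w, tgt_ctx = attend(dec_state, node_states)   # target-side attention
    head = int(tgt_w.argmax())                        # pointer to head node
    # Relation scored from the combined contexts (illustrative linear scorer
    # with random weights; a real model would learn these parameters).
    W = rng.standard_normal((len(relations), src_ctx.size + tgt_ctx.size))
    rel = relations[int((W @ np.concatenate([src_ctx, tgt_ctx])).argmax())]
    return head, rel, src_w

enc = rng.standard_normal((5, 8))    # 5 source tokens, hidden dim 8
nodes = rng.standard_normal((3, 8))  # 3 graph nodes built so far
head, rel, align = transduce_step(rng.standard_normal(8), enc, nodes,
                                  ["ARG0", "ARG1", "mod"])
```

Running the full decoder would repeat this step, appending each new node's state to `node_states`, so the meaning representation grows one semantic relation at a time.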

PDF | Abstract (IJCNLP 2019)
| Task         | Dataset             | Model                                           | Metric                  | Value | Global Rank |
|--------------|---------------------|-------------------------------------------------|-------------------------|-------|-------------|
| AMR Parsing  | LDC2014T12          | Broad-Coverage Semantic Parsing as Transduction | F1 (Full)               | 71.3  | #3          |
| AMR Parsing  | LDC2017T10          | Zhang et al.                                    | Smatch                  | 77.0  | #14         |
| UCCA Parsing | SemEval 2019 Task 1 | Neural Transducer                               | English-Wiki (open) F1  | 76.6  | #2          |

