AMR Parsing
49 papers with code • 8 benchmarks • 6 datasets
Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and so on.
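AMR graphs are conventionally written in PENMAN notation. The sketch below is a minimal, illustrative PENMAN reader (not any published parser's implementation) that turns such a string into relation triples; it shows how re-entrant variables make the AMR a rooted directed graph rather than a tree.

```python
import re

def parse_penman(s):
    """Parse a PENMAN-notation AMR string into (root variable, triples).

    Simplified sketch: handles nested (var / concept :role ...) nodes,
    re-entrant variable references, and plain or quoted constants.
    """
    tokens = re.findall(r'\(|\)|/|:[^\s()]+|"[^"]*"|[^\s()/:]+', s)
    pos = 0

    def parse_node():
        nonlocal pos
        assert tokens[pos] == '('
        pos += 1
        var = tokens[pos]; pos += 1        # variable name, e.g. 'w'
        assert tokens[pos] == '/'
        pos += 1
        concept = tokens[pos]; pos += 1    # concept, e.g. 'want-01'
        triples = [(var, 'instance', concept)]
        while tokens[pos] != ')':
            role = tokens[pos][1:]         # strip leading ':'
            pos += 1
            if tokens[pos] == '(':
                child, sub = parse_node()  # nested node
                triples.append((var, role, child))
                triples.extend(sub)
            else:                          # variable re-entry or constant
                triples.append((var, role, tokens[pos]))
                pos += 1
        pos += 1                           # consume ')'
        return var, triples

    return parse_node()

# "The boy wants to go": the variable b fills :ARG0 of both want-01
# and go-02 -- this re-entrancy is what makes the AMR a graph.
amr = '(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))'
root, triples = parse_penman(amr)
```

Because `b` appears as the object of two different roles, any faithful AMR parser must produce graphs with re-entrancies, which is exactly the property several of the papers below address.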
Most implemented papers
Robust Incremental Neural Semantic Graph Parsing
Parsing sentences to linguistically-expressive semantic representations is a key goal of Natural Language Processing.
A Transition-based Algorithm for Unrestricted AMR Parsing
Non-projective parsing can be useful to handle cycles and reentrancy in AMR graphs.
A Structured Syntax-Semantics Interface for English-AMR Alignment
Abstract Meaning Representation (AMR) annotations are often assumed to closely mirror dependency syntax, but AMR explicitly does not require this, and the assumption has never been tested.
Sequence-to-sequence Models for Cache Transition Systems
In this paper, we present a sequence-to-sequence based approach for mapping natural language sentences to AMR semantic graphs.
An AMR Aligner Tuned by Transition-based Parser
In this paper, we propose a new rich resource enhanced AMR aligner which produces multiple alignments and a new transition system for AMR parsing along with its oracle parser.
AMR Parsing as Sequence-to-Graph Transduction
Our model outperforms all previously reported SMATCH scores on both AMR 2.0 (76.3% F1 on LDC2017T10) and AMR 1.0 (70.2% F1 on LDC2014T12).