AMR Parsing
49 papers with code • 8 benchmarks • 6 datasets
Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and so on.
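As an illustration, the classic example from the AMR literature is the graph for "The boy wants to go", written in PENMAN notation as `(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))`. The sketch below hand-codes this graph as concept and relation triples and checks the single-rooted property; the helper `roots` is illustrative, not part of any AMR toolkit.

```python
# The AMR for "The boy wants to go", hand-coded as triples.
# PENMAN: (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))
# Note the PropBank predicates (want-01, go-01) and the reentrancy:
# variable b fills :ARG0 of both want-01 and go-01.

instances = {"w": "want-01", "b": "boy", "g": "go-01"}  # variable -> concept
edges = [("w", ":ARG0", "b"), ("w", ":ARG1", "g"), ("g", ":ARG0", "b")]

def roots(instances, edges):
    """Nodes with no incoming edge; a well-formed AMR has exactly one."""
    targets = {t for _, _, t in edges}
    return [n for n in instances if n not in targets]

print(roots(instances, edges))  # -> ['w']
```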
Latest papers
Translate, then Parse! A strong baseline for Cross-Lingual AMR Parsing
In cross-lingual Abstract Meaning Representation (AMR) parsing, researchers develop models that project sentences from various languages onto their AMRs to capture their essential semantic structures: given a sentence in any language, the goal is to capture its core semantic content through concepts connected by manifold types of semantic relations.
One SPRING to Rule Them Both: Symmetric AMR Semantic Parsing and Generation without a Complex Pipeline
In Text-to-AMR parsing, current state-of-the-art semantic parsers use cumbersome pipelines that integrate several different modules or components, and exploit graph recategorization, i.e., a set of content-specific heuristics developed on the basis of the training set.
AMR Parsing with Action-Pointer Transformer
In this work, we propose a transition-based system that combines hard-attention over sentences with a target-side action pointer mechanism to decouple source tokens from node representations and address alignments.
The Role of Reentrancies in Abstract Meaning Representation Parsing
Abstract Meaning Representation (AMR) parsing aims at converting sentences into AMR representations.
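Reentrancies are the nodes that make AMRs graphs rather than trees: a node is reentrant when it has more than one incoming edge, e.g. when coreference or control makes two predicates share an argument. A minimal sketch of detecting them, using the "The boy wants to go" graph (the function name is illustrative):

```python
# Edges of "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))":
# b is the :ARG0 of both w and g, so it is reentrant.
edges = [("w", ":ARG0", "b"), ("w", ":ARG1", "g"), ("g", ":ARG0", "b")]

def reentrant_nodes(edges):
    """Return nodes with in-degree greater than one."""
    indegree = {}
    for _, _, target in edges:
        indegree[target] = indegree.get(target, 0) + 1
    return sorted(n for n, d in indegree.items() if d > 1)

print(reentrant_nodes(edges))  # -> ['b']
```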
Pushing the Limits of AMR Parsing with Self-Learning
Abstract Meaning Representation (AMR) parsing has experienced a notable growth in performance in the last two years, due both to the impact of transfer learning and the development of novel architectures specific to AMR.
Transition-based Parsing with Stack-Transformers
Modeling the parser state is key to good performance in transition-based parsing.
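The parser state in a transition-based system is typically a buffer of input tokens, a stack of partially processed nodes, and the graph built so far; actions mutate this state step by step. The toy sketch below uses a generic, simplified SHIFT / LEFT-ARC / RIGHT-ARC action set for illustration only, not the action set of any specific AMR transition system.

```python
# A toy transition-based parser state: buffer, stack, and edge list.
class ParserState:
    def __init__(self, tokens):
        self.buffer = list(tokens)   # input tokens left to read
        self.stack = []              # partially processed nodes
        self.edges = []              # (head, label, dependent) triples

    def shift(self):
        """Move the next buffer token onto the stack."""
        self.stack.append(self.buffer.pop(0))

    def left_arc(self, label):
        """The node beneath the stack top becomes a dependent of the top."""
        dep = self.stack.pop(-2)
        self.edges.append((self.stack[-1], label, dep))

    def right_arc(self, label):
        """The stack top becomes a dependent of the node beneath it."""
        dep = self.stack.pop()
        self.edges.append((self.stack[-1], label, dep))

state = ParserState(["boy", "wants", "go"])
state.shift(); state.shift()
state.left_arc(":ARG0")   # adds (wants, :ARG0, boy)
state.shift()
state.right_arc(":ARG1")  # adds (wants, :ARG1, go)
print(state.edges)
```

In practice the action sequence is predicted by a model, and work such as the stack-transformer focuses precisely on how to encode this stack/buffer state for that prediction.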
Improving AMR Parsing with Sequence-to-Sequence Pre-training
In the literature, research on abstract meaning representation (AMR) parsing is much restricted by the size of the human-curated datasets that are critical to building an AMR parser with good performance.
AMR Parsing via Graph-Sequence Iterative Inference
We propose a new end-to-end model that treats AMR parsing as a series of dual decisions on the input sequence and the incrementally constructed graph.
Core Semantic First: A Top-down Approach for AMR Parsing
The output graph spans the nodes in order of their distance to the root, following the intuition of first grasping the main ideas and then digging into the details.
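Ordering nodes by distance to the root amounts to a breadth-first traversal: core concepts near the root come out before fine-grained modifiers. A minimal sketch, assuming the graph is given as a root plus a child map (the function name is illustrative):

```python
from collections import deque

def nodes_by_depth(root, children):
    """Breadth-first order: nodes grouped by distance to the root."""
    order, seen, queue = [], {root}, deque([(root, 0)])
    while queue:
        node, depth = queue.popleft()
        order.append((node, depth))
        for child in children.get(node, []):
            if child not in seen:     # reentrant nodes visited once
                seen.add(child)
                queue.append((child, depth + 1))
    return order

# "The boy wants to go": want-01 is the core predicate at depth 0.
children = {"want-01": ["boy", "go-01"], "go-01": ["boy"]}
print(nodes_by_depth("want-01", children))
# -> [('want-01', 0), ('boy', 1), ('go-01', 1)]
```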
SemBleu: A Robust Metric for AMR Parsing Evaluation
Evaluating AMR parsing accuracy involves comparing pairs of AMR graphs.
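A common way to compare two AMR graphs is to score the overlap of their triples, as the Smatch metric does. The sketch below computes a triple-level F1 under the simplifying assumption that variable names are already aligned between the two graphs; real Smatch searches over variable mappings, and SemBleu instead matches n-grams extracted from the graph.

```python
def triple_f1(predicted, gold):
    """F1 over AMR triples, assuming variables are already aligned.
    (Smatch additionally searches over variable mappings; this skips that.)"""
    pred, ref = set(predicted), set(gold)
    overlap = len(pred & ref)
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

gold = [("w", "instance", "want-01"), ("b", "instance", "boy"),
        ("w", ":ARG0", "b")]
pred = [("w", "instance", "want-01"), ("b", "instance", "boy"),
        ("w", ":ARG1", "b")]        # wrong role label on one edge
print(round(triple_f1(pred, gold), 2))  # -> 0.67
```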