AMR Parsing
36 papers with code • 8 benchmarks • 6 datasets
Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and more.
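For illustration, here is the standard AMR for "The boy wants to go" in PENMAN notation, decoded with the third-party `penman` Python library. The sentence and graph are a common textbook example from the AMR literature, not drawn from any paper below.

```python
# pip install penman
import penman

# "The boy wants to go" -- a standard AMR example.
# want-01 is a PropBank frame; :ARG0 / :ARG1 are its semantic roles.
# The variable b is reused under go-02, encoding within-sentence coreference.
amr = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
"""

graph = penman.decode(amr)
print(graph.top)      # 'w' -- the single root
print(graph.triples)  # [('w', ':instance', 'want-01'), ('w', ':ARG0', 'b'), ...]
```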
Most implemented papers
An Incremental Parser for Abstract Meaning Representation
We describe a transition-based parser for AMR that parses sentences left-to-right, in linear time.
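A minimal sketch of the transition-based idea (not this paper's exact transition set; the action names and the fixed `:ARG0` role are illustrative placeholders): the parser scans tokens left-to-right, and at each step a classifier picks one action that extends the partial graph, so the total number of actions stays linear in sentence length.

```python
from typing import Callable, List, Tuple

def parse_incrementally(tokens: List[str],
                        choose_action: Callable) -> List[Tuple[str, str, str]]:
    """Left-to-right transition parsing sketch.

    choose_action(stack, buffer, graph) returns one of:
      "SHIFT"   push the next token onto the stack as a concept
      "REDUCE"  pop the stack top (the token maps to no node)
      "ARC"     add an edge between the two topmost stack items
    """
    stack, buffer, graph = [], list(tokens), []
    # Linear time: the action budget is a constant multiple of sentence length.
    for _ in range(4 * len(tokens)):
        action = choose_action(stack, buffer, graph)
        if action == "SHIFT" and buffer:
            stack.append(buffer.pop(0))
        elif action == "REDUCE" and stack:
            stack.pop()
        elif action == "ARC" and len(stack) >= 2:
            graph.append((stack[-2], ":ARG0", stack[-1]))  # role chosen by the model in practice
        else:
            break  # no valid action left; real parsers constrain the classifier
    return graph
```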
Neural AMR: Sequence-to-Sequence Models for Parsing and Generation
Sequence-to-sequence models have shown strong performance across a broad range of applications.
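Seq2seq approaches treat parsing as translation from the sentence to a linearized graph string. A common preprocessing step, sketched below under simplifying assumptions (variable handling and tokenization differ across papers), flattens the PENMAN form and strips variable names to keep the target vocabulary small:

```python
import re

def linearize(penman_str: str) -> str:
    """Flatten a PENMAN AMR into a one-line token sequence for seq2seq training.

    Dropping the 'v /' variable prefixes is one common simplification;
    models that must recover re-entrancies keep the variables instead.
    """
    s = re.sub(r"\s+", " ", penman_str.strip())  # collapse newlines/indentation
    s = re.sub(r"\b\w+ / ", "", s)               # "(w / want-01" -> "(want-01"
    return s

print(linearize("(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"))
# (want-01 :ARG0 (boy) :ARG1 (go-02 :ARG0 b))
```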
AMR Parsing via Graph-Sequence Iterative Inference
We propose a new end-to-end model that treats AMR parsing as a series of dual decisions on the input sequence and the incrementally constructed graph.
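The iterative idea, in outline (a schematic sketch; the callables are invented stand-ins for the paper's neural components): each step answers two questions, which part of the input to abstract next, and where in the partial graph to attach the resulting node.

```python
def graph_sequence_inference(tokens, predict_concept, predict_attachment,
                             max_nodes=64):
    """Alternate between reading the sequence and growing the graph.

    predict_concept(tokens, graph)     -> next concept label, or None when done
    predict_attachment(graph, concept) -> (parent_node_index, relation_label)
    Both are placeholders for the model's attention-based components.
    """
    graph = []  # list of (concept, parent_index, relation); entry 0 is the root
    for _ in range(max_nodes):
        concept = predict_concept(tokens, graph)   # what to abstract from the sequence?
        if concept is None:
            break
        if not graph:
            graph.append((concept, None, None))    # first concept becomes the root
        else:
            parent, rel = predict_attachment(graph, concept)  # where in the graph?
            graph.append((concept, parent, rel))
    return graph
```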
RIGA at SemEval-2016 Task 8: Impact of Smatch Extensions and Character-Level Neural Translation on AMR Parsing Accuracy
The first extension combines the Smatch scoring script with the C6.0 rule-based classifier to produce a human-readable report on the frequency of error patterns observed in the scored AMR graphs.
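Smatch scores the overlap between two AMRs as an F1 over triples under the best variable-to-variable mapping (the reference implementation searches mappings by hill-climbing). A minimal sketch of the final scoring arithmetic, with the mapping search omitted:

```python
def smatch_f1(matched: int, test_triples: int, gold_triples: int) -> float:
    """F1 from triple counts, as Smatch reports it.

    matched       -- triples shared under the best variable mapping
    test_triples  -- triple count in the parser's AMR
    gold_triples  -- triple count in the gold AMR
    """
    if test_triples == 0 or gold_triples == 0:
        return 0.0
    precision = matched / test_triples
    recall = matched / gold_triples
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# e.g. 7 matched triples, 9 predicted, 10 gold -> P~=0.78, R=0.70
print(round(smatch_f1(7, 9, 10), 2))  # 0.74
```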
Neural Semantic Parsing by Character-based Translation: Experiments with Abstract Meaning Representations
We evaluate the character-level translation method for neural semantic parsing on a large corpus of sentences annotated with Abstract Meaning Representations (AMRs).
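Character-level translation splits the source side into characters rather than words, which sidesteps out-of-vocabulary concepts. A small illustrative tokenizer (the special tokens are one common convention, not this paper's exact setup):

```python
def to_chars(sentence: str) -> list:
    """Character-level source tokenization; spaces become an explicit token
    so the model can still see word boundaries."""
    return ["<s>"] + ["<sp>" if c == " " else c for c in sentence] + ["</s>"]

print(to_chars("the boy"))
# ['<s>', 't', 'h', 'e', '<sp>', 'b', 'o', 'y', '</s>']
```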
AMR Parsing as Graph Prediction with Latent Alignment
AMR parsing is challenging partly due to the lack of annotated alignments between nodes in the graphs and words in the corresponding sentences.
ATP: AMRize Then Parse! Enhancing AMR Parsing with PseudoAMRs
As Abstract Meaning Representation (AMR) implicitly involves compound semantic annotations, we hypothesize auxiliary tasks which are semantically or formally related can better enhance AMR parsing.
Robust Incremental Neural Semantic Graph Parsing
Parsing sentences to linguistically-expressive semantic representations is a key goal of Natural Language Processing.