AMR Parsing
49 papers with code • 8 benchmarks • 6 datasets
Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and so on.
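As an illustration, consider the canonical example sentence "The boy wants to go". The sketch below (hand-built, not produced by any parser listed here) stores its AMR as (source, role, target) triples with a designated root, using the PropBank framesets want-01 and go-02; the reentrant variable "b" shows within-sentence coreference, since the boy is the ARG0 of both predicates. The reachability check at the end verifies the single-rooted property.

```python
# A hand-built sketch of the AMR for "The boy wants to go":
#   (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))
top = "w"
triples = [
    ("w", ":instance", "want-01"),  # PropBank frameset
    ("b", ":instance", "boy"),
    ("g", ":instance", "go-02"),
    ("w", ":ARG0", "b"),            # the boy is the wanter
    ("w", ":ARG1", "g"),            # the going is what is wanted
    ("g", ":ARG0", "b"),            # reentrancy: the boy is also the goer
]

# Verify the graph is single-rooted: every node is reachable
# from the top via directed edges.
edges = {(s, t) for s, r, t in triples if r != ":instance"}
nodes = {s for s, r, t in triples if r == ":instance"}
reach, frontier = {top}, [top]
while frontier:
    n = frontier.pop()
    for s, t in edges:
        if s == n and t not in reach:
            reach.add(t)
            frontier.append(t)
assert reach == nodes
```

The reentrant edge `("g", ":ARG0", "b")` is what makes AMRs graphs rather than trees, and it is exactly this property that parsers must recover from flat token sequences.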
Latest papers with no code
XLPT-AMR: Cross-Lingual Pre-Training via Multi-Task Learning for Zero-Shot AMR Parsing and Text Generation
We hope that knowledge gained while learning English AMR parsing and text generation can be transferred to their counterparts in other languages.
SGL: Speaking the Graph Languages of Semantic Parsing via Multilingual Translation
Graph-based semantic parsing aims to represent textual meaning through directed graphs.
Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing
We provide a detailed comparison with recent progress in AMR parsing and show that the proposed parser retains the desirable properties of previous transition-based approaches, while being simpler and reaching a new parsing state of the art on AMR 2.0.
Parsing Indonesian Sentence into Abstract Meaning Representation using Machine Learning Approach
Pair prediction uses a dependency parsing component to obtain the edges between words for the AMR.
AMR Parsing with Action-Pointer Transformer
Abstract Meaning Representation parsing belongs to a category of sentence-to-graph prediction tasks where the target graph is not explicitly linked to the sentence tokens.
JBNU at MRP 2020: AMR Parsing Using a Joint State Model for Graph-Sequence Iterative Inference
This paper describes the Jeonbuk National University (JBNU) system for the 2020 shared task on Cross-Framework Meaning Representation Parsing at the Conference on Computational Natural Language Learning.
A Differentiable Relaxation of Graph Segmentation and Alignment for AMR Parsing
In contrast, we treat both alignment and segmentation as latent variables in our model and induce them as part of end-to-end training.
AMR Parsing with Latent Structural Information
Abstract Meaning Representations (AMRs) capture sentence-level semantics as structured representations of broad-coverage natural sentences.
Constructing Web-Accessible Semantic Role Labels and Frames for Japanese as Additions to the NPCMJ Parsed Corpus
As part of constructing the NINJAL Parsed Corpus of Modern Japanese (NPCMJ), a web-accessible language resource, we are adding frame information for predicates, together with two types of semantic role labels that mark the contributions of arguments.
Broad-Coverage Semantic Parsing as Transduction
We unify different broad-coverage semantic parsing tasks under a transduction paradigm, and propose an attention-based neural framework that incrementally builds a meaning representation via a sequence of semantic relations.