AMR Parsing
49 papers with code • 8 benchmarks • 6 datasets
Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and so on.
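As a concrete illustration, AMRs are conventionally written in PENMAN notation. The sketch below decodes the AMR for "The boy wants to go" using the open-source `penman` Python library (chosen here for illustration; no particular parser is assumed):

```python
# pip install penman
import penman

# AMR for "The boy wants to go": want-01 and go-02 are PropBank
# frames, and reusing the variable b captures within-sentence
# coreference (the boy is both the wanter and the goer).
amr = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
"""

graph = penman.decode(amr)
print(graph.top)      # 'w' -- the single root of the graph
print(graph.triples)  # the graph as (source, role, target) triples
```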
Latest papers
Graph Pre-training for AMR Parsing and Generation
To our knowledge, we are the first to consider pre-training on semantic graphs.
Maximum Bayes Smatch Ensemble Distillation for AMR Parsing
AMR parsing has experienced an unprecedented increase in performance in the last three years, due to a mixture of effects including architecture improvements and transfer learning.
Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing
We provide a detailed comparison with recent progress in AMR parsing and show that the proposed parser retains the desirable properties of previous transition-based approaches, while being simpler and reaching the new parsing state of the art for AMR 2.0, without the need for graph re-categorization.
Ensembling Graph Predictions for AMR Parsing
In many machine learning tasks, models are trained to predict structured data such as graphs.
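To make the idea of ensembling graph predictions concrete, one simple strategy is to keep the candidate parse that agrees most with the others. The sketch below uses Jaccard overlap on triples as a rough stand-in for a graph-similarity metric such as Smatch; it is an illustrative simplification, not the paper's algorithm:

```python
def pick_consensus_graph(candidates):
    """Toy graph ensembling: score each candidate parse by its
    average triple overlap with the other candidates and keep
    the highest-scoring one.

    `candidates` is a list of sets of (source, role, target) triples,
    one set per parser in the ensemble.
    """
    def overlap(a, b):
        # Jaccard similarity over triples as a crude Smatch proxy.
        return len(a & b) / max(len(a | b), 1)

    return max(
        candidates,
        key=lambda g: sum(overlap(g, h) for h in candidates if h is not g),
    )
```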
Hierarchical Curriculum Learning for AMR Parsing
Abstract Meaning Representation (AMR) parsing aims to translate sentences into semantic representations with a hierarchical structure, and has recently been empowered by pretrained sequence-to-sequence models.
Multilingual AMR Parsing with Noisy Knowledge Distillation
We study multilingual AMR parsing from the perspective of knowledge distillation, where the aim is to learn and improve a multilingual AMR parser by using an existing English parser as its teacher.
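As background, knowledge distillation trains a student model to match the teacher's output distribution. A minimal token-level distillation loss in PyTorch might look like the following; this is a generic sketch with an assumed temperature hyperparameter, not the paper's exact objective (which additionally deals with noise in the teacher's outputs):

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Token-level KD: KL divergence between the teacher's and the
    student's distributions over the output vocabulary.

    Both logit tensors have shape (..., vocab_size).
    """
    t = temperature
    p_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_p_student = F.log_softmax(student_logits / t, dim=-1)
    # Scale by t^2 so gradients are comparable to the hard-label loss.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (t * t)
```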
ELIT: Emory Language and Information Toolkit
We introduce ELIT, the Emory Language and Information Toolkit, which is a comprehensive NLP framework providing transformer-based end-to-end models for core tasks with a special focus on memory efficiency while maintaining state-of-the-art accuracy and speed.
Levi Graph AMR Parser using Heterogeneous Attention
Coupled with biaffine decoders, transformers have been effectively adapted to text-to-graph transduction and achieved state-of-the-art performance on AMR parsing.
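For readers unfamiliar with the term, a biaffine decoder scores every pair of token (or node) representations with a bilinear form. The PyTorch sketch below is a generic illustration of that scoring function, not the specific module used in this paper:

```python
import torch
import torch.nn as nn

class BiaffineScorer(nn.Module):
    """Biaffine scoring of head-dependent pairs, as commonly used
    in graph-based parsing decoders."""

    def __init__(self, dim: int):
        super().__init__()
        # One (dim+1) x (dim+1) matrix; the appended bias feature
        # folds the linear and constant terms into the bilinear form.
        self.U = nn.Parameter(torch.empty(dim + 1, dim + 1))
        nn.init.xavier_uniform_(self.U)

    def forward(self, heads: torch.Tensor, deps: torch.Tensor) -> torch.Tensor:
        # heads, deps: (batch, seq_len, dim)
        ones = heads.new_ones(*heads.shape[:-1], 1)
        h = torch.cat([heads, ones], dim=-1)
        d = torch.cat([deps, ones], dim=-1)
        # scores[b, i, j] = h[b, i]^T U d[b, j]
        return torch.einsum("bid,de,bje->bij", h, self.U, d)
```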
Probabilistic, Structure-Aware Algorithms for Improved Variety, Accuracy, and Coverage of AMR Alignments
We present algorithms for aligning components of Abstract Meaning Representation (AMR) graphs to spans in English sentences.
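For illustration only, a naive baseline aligner might match concept lemmas to surface tokens by string prefix. The hypothetical helper below sketches that idea with the `penman` library; it is far cruder than the probabilistic, structure-aware algorithms the paper proposes:

```python
import penman

def naive_align(amr_str, tokens):
    """Toy aligner: map each AMR concept to the first token sharing
    a prefix with the concept's lemma (PropBank sense id stripped).
    Returns {variable: token_index}."""
    graph = penman.decode(amr_str)
    alignments = {}
    for var, _, concept in graph.instances():
        lemma = concept.rsplit("-", 1)[0]  # e.g. "want-01" -> "want"
        for i, tok in enumerate(tokens):
            if tok.lower().startswith(lemma[:4]):
                alignments[var] = i
                break
    return alignments

# e.g. naive_align("(w / want-01 :ARG0 (b / boy))",
#                  ["The", "boy", "wants", "to", "go"])
# -> {'w': 2, 'b': 1}
```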
Making Better Use of Bilingual Information for Cross-Lingual AMR Parsing
We argue that the misprediction of concepts stems from the strong correspondence between English tokens and AMR concepts.