AMR Parsing

49 papers with code • 8 benchmarks • 6 datasets

Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and so on.
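As an illustration, the canonical example sentence "The boy wants to go" can be encoded as a rooted, directed graph whose nodes carry PropBank frames and whose reentrant edges express within-sentence coreference. A minimal sketch in plain Python (the variable names `w`, `b`, `g` and the triple encoding are illustrative, not a standard library format):

```python
# AMR for "The boy wants to go", stored as (source, role, target) triples.
root = "w"
triples = [
    ("w", ":instance", "want-01"),  # PropBank frame for "wants"
    ("b", ":instance", "boy"),
    ("g", ":instance", "go-02"),
    ("w", ":ARG0", "b"),            # the boy is the wanter
    ("w", ":ARG1", "g"),            # going is what is wanted
    ("g", ":ARG0", "b"),            # reentrancy: the boy is also the goer
]

# Single-rootedness check: only the root has no incoming role edge.
targets = {t for _, r, t in triples if r != ":instance"}
sources = {s for s, _, _ in triples}
roots = sorted(sources - targets)
print(roots)  # ['w']
```

The reentrant edge `("g", ":ARG0", "b")` is what makes AMRs graphs rather than trees: the node `b` is the argument of both `want-01` and `go-02`.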

Graph Pre-training for AMR Parsing and Generation

muyeby/amrbart ACL 2022

To our knowledge, we are the first to consider pre-training on semantic graphs.

89 ★ · 15 Mar 2022

Maximum Bayes Smatch Ensemble Distillation for AMR Parsing

IBM/transition-amr-parser NAACL 2022

AMR parsing has experienced an unprecedented increase in performance in the last three years, due to a mixture of effects including architecture improvements and transfer learning.

229 ★ · 14 Dec 2021
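Smatch, the metric named in the title above, scores two AMRs by the F1 over their overlapping triples after searching for the best mapping between their variables. A simplified sketch that assumes the variables are already consistently named (an assumption — real Smatch optimizes the variable mapping, typically by hill climbing):

```python
# Simplified Smatch-style F1 between two AMR triple sets.
# Skips the variable-alignment search of the real metric.
def triple_f1(gold, pred):
    gold, pred = set(gold), set(pred)
    overlap = len(gold & pred)
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred)
    recall = overlap / len(gold)
    return 2 * precision * recall / (precision + recall)

gold = {("w", ":instance", "want-01"), ("b", ":instance", "boy"), ("w", ":ARG0", "b")}
pred = {("w", ":instance", "want-01"), ("b", ":instance", "boy"), ("w", ":ARG1", "b")}
print(round(triple_f1(gold, pred), 3))  # 2 of 3 triples match both ways -> 0.667
```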

Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing

IBM/transition-amr-parser EMNLP 2021

We provide a detailed comparison with recent progress in AMR parsing and show that the proposed parser retains the desirable properties of previous transition-based approaches, while being simpler and reaching a new parsing state of the art for AMR 2.0, without the need for graph re-categorization.

229 ★ · 29 Oct 2021

Ensembling Graph Predictions for AMR Parsing

ibm/graph_ensemble_learning NeurIPS 2021

In many machine learning tasks, models are trained to predict structured data such as graphs.

37 ★ · 18 Oct 2021
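One straightforward way to ensemble graph predictions is majority voting over the triples proposed by individual parsers — a hedged illustration of the general idea, not necessarily the algorithm of the paper above:

```python
# Illustrative graph-ensembling baseline: keep each triple that a
# strict majority of parsers predicted.
from collections import Counter

def majority_triples(predictions):
    """predictions: list of triple sets, one per parser."""
    votes = Counter(t for triples in predictions for t in set(triples))
    quorum = len(predictions) / 2
    return {t for t, n in votes.items() if n > quorum}

p1 = {("w", ":instance", "want-01"), ("w", ":ARG0", "b")}
p2 = {("w", ":instance", "want-01"), ("w", ":ARG1", "g")}
p3 = {("w", ":instance", "want-01"), ("w", ":ARG0", "b")}
print(sorted(majority_triples([p1, p2, p3])))
# [('w', ':ARG0', 'b'), ('w', ':instance', 'want-01')]
```

Note that plain voting can return a triple set that is not itself a well-formed graph; a real ensembler must also enforce structural constraints such as connectivity and single-rootedness.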

Hierarchical Curriculum Learning for AMR Parsing

wangpeiyi9979/hcl-text2amr ACL 2022

Abstract Meaning Representation (AMR) parsing aims to translate sentences to semantic representation with a hierarchical structure, and is recently empowered by pretrained sequence-to-sequence models.

13 ★ · 15 Oct 2021

Multilingual AMR Parsing with Noisy Knowledge Distillation

jcyk/xamr Findings (EMNLP) 2021

We study multilingual AMR parsing from the perspective of knowledge distillation, where the aim is to learn and improve a multilingual AMR parser by using an existing English parser as its teacher.

10 ★ · 30 Sep 2021

ELIT: Emory Language and Information Toolkit

emorynlp/elit 8 Sep 2021

We introduce ELIT, the Emory Language and Information Toolkit, which is a comprehensive NLP framework providing transformer-based end-to-end models for core tasks with a special focus on memory efficiency while maintaining state-of-the-art accuracy and speed.

36 ★ · 08 Sep 2021

Levi Graph AMR Parser using Heterogeneous Attention

emorynlp/levi-graph-amr-parser ACL (IWPT) 2021

Coupled with biaffine decoders, transformers have been effectively adapted to text-to-graph transduction and achieved state-of-the-art performance on AMR parsing.

10 ★ · 09 Jul 2021

Probabilistic, Structure-Aware Algorithms for Improved Variety, Accuracy, and Coverage of AMR Alignments

ablodge/leamr ACL 2021

We present algorithms for aligning components of Abstract Meaning Representation (AMR) graphs to spans in English sentences.

15 ★ · 10 Jun 2021

Making Better Use of Bilingual Information for Cross-Lingual AMR Parsing

headacheboy/cross-lingual-amr-parsing Findings (ACL) 2021

We argue that the misprediction of concepts is due to the high relevance between English tokens and AMR concepts.

2 ★ · 09 Jun 2021