AMR Parsing

49 papers with code • 8 benchmarks • 6 datasets

Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and more.
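As a toy illustration (plain Python, no AMR libraries; the example sentence, variable names, and helper logic are our own, not from any parser above), the AMR for "The boy wants to go" can be encoded as (source, role, target) triples. The variable `b` filling a role in two predicates is a within-sentence reentrancy, i.e., coreference:

```python
# AMR for "The boy wants to go", encoded as (source, role, target) triples.
# PENMAN form: (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))
triples = [
    ("w", ":instance", "want-01"),  # PropBank frame for "wants"
    ("b", ":instance", "boy"),
    ("g", ":instance", "go-02"),    # PropBank frame for "go"
    ("w", ":ARG0", "b"),            # the boy is the wanter
    ("w", ":ARG1", "g"),            # going is what is wanted
    ("g", ":ARG0", "b"),            # the boy is also the goer (reentrancy)
]

root = "w"  # every AMR has a single root

# A variable that is the target of more than one edge is a reentrancy.
variables = {s for s, r, _ in triples if r == ":instance"}
in_degree = {}
for src, role, tgt in triples:
    if role != ":instance" and tgt in variables:
        in_degree[tgt] = in_degree.get(tgt, 0) + 1

print(in_degree)  # {'b': 2, 'g': 1} -- "b" is reentrant
```

The same triple encoding underlies most AMR toolkits, which read and write the equivalent PENMAN text form shown in the comment.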

Translate, then Parse! A strong baseline for Cross-Lingual AMR Parsing

Heidelberg-NLP/simple-xamr ACL (IWPT) 2021

In cross-lingual Abstract Meaning Representation (AMR) parsing, models project sentences from various languages onto their AMRs: given a sentence in any language, the goal is to capture its core semantic content through concepts connected by manifold types of semantic relations.

4
08 Jun 2021

One SPRING to Rule Them Both: Symmetric AMR Semantic Parsing and Generation without a Complex Pipeline

SapienzaNLP/spring Proceedings of the AAAI Conference on Artificial Intelligence 2021

In Text-to-AMR parsing, current state-of-the-art semantic parsers rely on cumbersome pipelines that integrate several different modules or components and exploit graph recategorization, i.e., a set of content-specific heuristics developed on the basis of the training set.

121
18 May 2021

AMR Parsing with Action-Pointer Transformer

ibm/graph_ensemble_learning NAACL 2021

In this work, we propose a transition-based system that combines hard-attention over sentences with a target-side action pointer mechanism to decouple source tokens from node representations and address alignments.
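A minimal sketch of the transition-based idea (the action names, sequence, and encoding here are illustrative, not the paper's exact inventory): node-creating actions append to a history of generated nodes, and arc actions point back into that history by index rather than at source tokens, which is the essence of a target-side pointer mechanism.

```python
# Toy transition system: SHIFT consumes a token, PRED creates a graph node,
# ARC points at earlier nodes by their position in the action history.
tokens = ["The", "boy", "wants", "to", "go"]
actions = [
    ("SHIFT", None), ("PRED", "boy"), ("SHIFT", None),
    ("PRED", "want-01"), ("ARC", (1, ":ARG0", 0)),  # want-01 -> boy
    ("SHIFT", None), ("PRED", "go-02"),
    ("ARC", (1, ":ARG1", 2)),                       # want-01 -> go-02
    ("ARC", (2, ":ARG0", 0)),                       # go-02  -> boy (reentrancy)
]

nodes, edges = [], []
for act, arg in actions:
    if act == "PRED":
        nodes.append(arg)                  # new node, indexed by creation order
    elif act == "ARC":
        src, role, tgt = arg
        edges.append((nodes[src], role, nodes[tgt]))  # pointer into node history

print(nodes)  # ['boy', 'want-01', 'go-02']
print(edges)
```

Because arcs reference node indices instead of token positions, one token can yield several nodes (or none) without complicating the arc actions.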

37
29 Apr 2021

The Role of Reentrancies in Abstract Meaning Representation Parsing

mdtux89/amr-reentrancies Findings of the Association for Computational Linguistics 2020

Abstract Meaning Representation (AMR) parsing aims at converting sentences into AMR representations.

2
01 Nov 2020

Pushing the Limits of AMR Parsing with Self-Learning

IBM/transition-amr-parser Findings of the Association for Computational Linguistics 2020

Abstract Meaning Representation (AMR) parsing has experienced a notable growth in performance in the last two years, due both to the impact of transfer learning and the development of novel architectures specific to AMR.

230
20 Oct 2020

Improving AMR Parsing with Sequence-to-Sequence Pre-training

xdqkid/S2S-AMR-Parser EMNLP 2020

In the literature, research on abstract meaning representation (AMR) parsing has been much restricted by the size of the human-curated datasets that are critical to building a high-performance AMR parser.

42
05 Oct 2020

AMR Parsing via Graph-Sequence Iterative Inference

bjascob/amrlib ACL 2020

We propose a new end-to-end model that treats AMR parsing as a series of dual decisions on the input sequence and the incrementally constructed graph.

212
12 Apr 2020

Core Semantic First: A Top-down Approach for AMR Parsing

jcyk/AMR-parser IJCNLP 2019

The output graph spans its nodes in order of their distance to the root, following the intuition of first grasping the main ideas and then digging into the details.
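The top-down intuition can be sketched as a breadth-first traversal (the toy graph and its adjacency-list encoding are our own, not the paper's code): nodes are emitted level by level, so the core predicate at the root comes before its arguments.

```python
from collections import deque

# Toy AMR as adjacency lists: want-01 (w) -> boy (b), go-02 (g); go-02 -> boy.
edges = {"w": ["b", "g"], "b": [], "g": ["b"]}
root = "w"

# Breadth-first order: distance-0 node first, then distance-1 nodes, etc.
order, seen, queue = [], {root}, deque([root])
while queue:
    node = queue.popleft()
    order.append(node)
    for child in edges[node]:
        if child not in seen:       # reentrant nodes are visited only once
            seen.add(child)
            queue.append(child)

print(order)  # ['w', 'b', 'g'] -- core semantics first, details later
```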

11
10 Sep 2019

SemBleu: A Robust Metric for AMR Parsing Evaluation

gotheregit/chinese-amr ACL 2019

Evaluating AMR parsing accuracy involves comparing pairs of AMR graphs.
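A simplified sketch of comparing AMR graphs via n-gram overlap (not the official SemBleu implementation, which linearizes graphs by traversal from the root; the function names and example graphs below are our own): treat concepts as unigrams and labeled edges between concepts as bigrams, then score the clipped overlap of candidate against reference.

```python
from collections import Counter

def graph_ngrams(instances, edges):
    """instances: variable -> concept; edges: (src, role, tgt) triples."""
    unigrams = Counter(instances.values())
    bigrams = Counter((instances[s], role, instances[t]) for s, role, t in edges)
    return unigrams, bigrams

def precision(cand, ref):
    """Fraction of candidate n-grams also found in the reference (clipped)."""
    total = sum(cand.values())
    return sum((cand & ref).values()) / total if total else 0.0

# Reference AMR for "The boy wants to go".
ref_inst = {"w": "want-01", "b": "boy", "g": "go-02"}
ref_edges = [("w", ":ARG0", "b"), ("w", ":ARG1", "g"), ("g", ":ARG0", "b")]

# Candidate parse with one mis-attached edge (:ARG1 points at the wrong node).
cand_inst = {"w": "want-01", "b": "boy", "g": "go-02"}
cand_edges = [("w", ":ARG0", "b"), ("w", ":ARG1", "b"), ("g", ":ARG0", "b")]

c_uni, c_bi = graph_ngrams(cand_inst, cand_edges)
r_uni, r_bi = graph_ngrams(ref_inst, ref_edges)
print(precision(c_uni, r_uni))  # 1.0 -- all concepts correct
print(precision(c_bi, r_bi))    # 2/3 -- one of three edges is wrong
```

Concept-level scores alone would miss the attachment error; edge-level (and longer) n-grams are what make such a metric sensitive to graph structure.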

34
26 May 2019