About

Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and so on.
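To make the representation concrete, here is a minimal, dependency-free Python sketch of the standard AMR for "The boy wants to go" as a rooted, directed graph of triples. The helper and variable names are illustrative only (they are not taken from any parser listed below); note how the variable `b` is reused to express within-sentence coreference, since the boy is both the wanter and the goer.

```python
# Minimal sketch: an AMR as a rooted, directed graph of triples.
# PENMAN notation for "The boy wants to go":
#   (w / want-01
#      :ARG0 (b / boy)
#      :ARG1 (g / go-01
#               :ARG0 b))   # reentrant "b" = within-sentence coreference

# Each triple is (source variable, relation, target).
top = "w"  # the single root of the graph
triples = [
    ("w", "instance", "want-01"),  # PropBank frame for "wants"
    ("b", "instance", "boy"),
    ("g", "instance", "go-01"),    # PropBank frame for "go"
    ("w", "ARG0", "b"),            # the boy is the wanter
    ("w", "ARG1", "g"),            # what is wanted: the going event
    ("g", "ARG0", "b"),            # the boy is also the goer (coreference)
]

def children(var, triples):
    """Return the outgoing (relation, target) edges of a node."""
    return [(rel, tgt) for src, rel, tgt in triples if src == var and rel != "instance"]

print(children(top, triples))  # [('ARG0', 'b'), ('ARG1', 'g')]
```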


Greatest papers with code

AMR Parsing as Sequence-to-Graph Transduction

ACL 2019 sheng-z/stog

Our experimental results outperform all previously reported SMATCH scores, on both AMR 2.0 (76.3% F1 on LDC2017T10) and AMR 1.0 (70.2% F1 on LDC2014T12).

AMR PARSING
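For context on the SMATCH numbers quoted above: SMATCH scores two AMRs by searching for a one-to-one mapping between their variables that maximizes the number of matching triples, then reports precision, recall, and F1 over those triples. The sketch below shows only the final F1 arithmetic for an assumed best mapping; the mapping search itself (hill-climbing in the official smatch tool) is omitted, and the function names are illustrative rather than taken from that tool.

```python
# Illustrative arithmetic behind a SMATCH-style F1 (the variable-mapping
# search used by the real smatch tool is assumed to have been done already).

def smatch_f1(pred_triples, gold_triples, mapping):
    """pred_triples/gold_triples: sets of (source, relation, target) triples.
    mapping: dict renaming predicted variables to gold variables."""
    # Rename predicted variables according to the assumed best mapping.
    renamed = {(mapping.get(s, s), r, mapping.get(t, t)) for s, r, t in pred_triples}
    matched = len(renamed & set(gold_triples))
    precision = matched / len(pred_triples) if pred_triples else 0.0
    recall = matched / len(gold_triples) if gold_triples else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Tiny example: the parser output uses variables x/y, the gold graph uses w/b.
gold = {("w", "instance", "want-01"), ("b", "instance", "boy"), ("w", "ARG0", "b")}
pred = {("x", "instance", "want-01"), ("y", "instance", "boy"), ("x", "ARG1", "y")}
print(round(smatch_f1(pred, gold, {"x": "w", "y": "b"}), 3))  # 0.667
```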

AMR Parsing via Graph-Sequence Iterative Inference

ACL 2020 jcyk/AMR-gs

We propose a new end-to-end model that treats AMR parsing as a series of dual decisions on the input sequence and the incrementally constructed graph.

Ranked #2 on AMR Parsing on LDC2014T12 (F1 Full metric)

AMR PARSING LANGUAGE MODELLING

AMR Parsing as Graph Prediction with Latent Alignment

ACL 2018 ChunchuanLv/AMR_AS_GRAPH_PREDICTION

AMR parsing is challenging partly due to the lack of annotated alignments between nodes in the graphs and words in the corresponding sentences.

AMR PARSING

Robust Incremental Neural Semantic Graph Parsing

ACL 2017 janmbuys/DeepDeepParser

Parsing sentences to linguistically-expressive semantic representations is a key goal of Natural Language Processing.

AMR PARSING

Neural Semantic Parsing by Character-based Translation: Experiments with Abstract Meaning Representations

28 May 2017 RikVN/AMR

We evaluate the character-level translation method for neural semantic parsing on a large corpus of sentences annotated with Abstract Meaning Representations (AMRs).

AMR PARSING

Improving AMR Parsing with Sequence-to-Sequence Pre-training

EMNLP 2020 xdqkid/S2S-AMR-Parser

In the literature, research on abstract meaning representation (AMR) parsing is much restricted by the size of the human-curated datasets, which are critical to building an AMR parser with good performance.

AMR PARSING MACHINE TRANSLATION MULTI-TASK LEARNING