AMR Parsing

49 papers with code • 8 benchmarks • 6 datasets

Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and so on.
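The rooted-graph structure above can be sketched as a set of triples. This is an illustrative encoding, not any particular parser's output format; the example sentence "The boy wants to go" and the variable names (w, b, g) are assumptions for the demonstration:

```python
# AMR for "The boy wants to go", encoded as (source, relation, target)
# triples. Variable names and the sentence are illustrative only.
amr_triples = [
    ("w", "instance", "want-01"),  # PropBank frame for "wants"
    ("b", "instance", "boy"),
    ("g", "instance", "go-02"),
    ("w", "ARG0", "b"),            # the boy is the wanter
    ("w", "ARG1", "g"),            # going is what is wanted
    ("g", "ARG0", "b"),            # reentrancy: the boy is also the goer
]

root = "w"  # single root: the want-01 node

# Reentrancy: a node that is the target of more than one relation edge.
targets = [t for _, rel, t in amr_triples if rel != "instance"]
reentrant = {t for t in targets if targets.count(t) > 1}
print(reentrant)  # {'b'} — "b" fills a role in two frames
```

The reentrant node `b` is what makes AMRs graphs rather than trees, and it is exactly this property that the transition-based and graph-transduction parsers below must handle.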

Most implemented papers

Robust Incremental Neural Semantic Graph Parsing

janmbuys/DeepDeepParser ACL 2017

Parsing sentences to linguistically-expressive semantic representations is a key goal of Natural Language Processing.

A Transition-based Algorithm for Unrestricted AMR Parsing

aghie/tb-amr NAACL 2018

Non-projective parsing can be useful to handle cycles and reentrancy in AMR graphs.

A Structured Syntax-Semantics Interface for English-AMR Alignment

ida-szubert/amr_ud NAACL 2018

Abstract Meaning Representation (AMR) annotations are often assumed to closely mirror dependency syntax, but AMR explicitly does not require this, and the assumption has never been tested.

Sequence-to-sequence Models for Cache Transition Systems

xiaochang13/CacheTransition-Seq2seq ACL 2018

In this paper, we present a sequence-to-sequence based approach for mapping natural language sentences to AMR semantic graphs.

An AMR Aligner Tuned by Transition-based Parser

Oneplus/tamr EMNLP 2018

In this paper, we propose a new rich resource enhanced AMR aligner which produces multiple alignments and a new transition system for AMR parsing along with its oracle parser.

AMR Parsing as Sequence-to-Graph Transduction

sheng-z/stog ACL 2019

Our experimental results outperform all previously reported SMATCH scores, on both AMR 2.0 (76.3% F1 on LDC2017T10) and AMR 1.0 (70.2% F1 on LDC2014T12).
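The SMATCH scores reported above measure overlap between a predicted and a gold AMR as an F1 over matched triples, computed under the variable mapping that maximizes the match. The following is a simplified sketch of that idea with a fixed, hand-supplied mapping; the real Smatch tool searches for the best mapping (e.g. by hill-climbing), and the graphs and mapping here are made up for illustration:

```python
# Simplified Smatch-style score: F1 of overlapping triples under a
# given predicted-variable -> gold-variable mapping. Real Smatch
# searches over mappings; here the mapping is fixed for illustration.
def smatch_f1(gold, pred, mapping):
    """gold/pred: lists of (source, relation, target) triples."""
    rename = lambda x: mapping.get(x, x)
    pred_mapped = {(rename(s), r, rename(t)) for s, r, t in pred}
    matched = len(pred_mapped & set(gold))
    precision = matched / len(pred) if pred else 0.0
    recall = matched / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = [("w", "instance", "want-01"), ("b", "instance", "boy"),
        ("w", "ARG0", "b")]
pred = [("x", "instance", "want-01"), ("y", "instance", "boy"),
        ("x", "ARG0", "y"), ("x", "polarity", "-")]  # one spurious triple
print(round(smatch_f1(gold, pred, {"x": "w", "y": "b"}), 3))  # 0.857
```

With the mapping {x→w, y→b}, three of the four predicted triples match the three gold triples, giving precision 0.75, recall 1.0, and F1 ≈ 0.857.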