AMR-to-Text Generation

15 papers with code • 5 benchmarks • 5 datasets

Abstract Meaning Representation (AMR) is a rooted, directed graph of labeled concepts and relations that captures sentence semantics. Its concepts encode propositional meaning and abstract away from lexical and syntactic choices. The graph is tree-like in structure: it has a single root node, and only a few nodes (reentrancies) have multiple parents. The goal of AMR-to-Text Generation is to recover a fluent sentence realization from an AMR graph. This task can be seen as the reverse of the structured prediction performed in AMR parsing.

Source: AMR-to-Text Generation with Cache Transition Systems
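
The structure described above is easiest to see on a concrete graph. The sketch below is a minimal illustration, not taken from any of the papers listed here, that uses the penman Python library to decode a small AMR for "The boy wants to go" and to inspect its root and its reentrant node; the sentence, variable names, and counting helper are our own example.

```python
# Minimal illustration of AMR structure using the `penman` library
# (pip install penman). The graph encodes "The boy wants to go":
# the variable `b` (boy) is reentrant, appearing as the ARG0 of
# both want-01 and go-02.
from collections import Counter

import penman

amr = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
"""

graph = penman.decode(amr)
print(graph.top)      # 'w' -- the single root (want-01)
print(graph.triples)  # labeled (source, relation, target) edges

# Count how often each variable appears as an edge target: any count
# greater than 1 is a reentrancy, i.e. a node with multiple parents.
targets = Counter(t for _, role, t in graph.triples
                  if role != ":instance" and t in graph.variables())
print(targets)        # Counter({'b': 2, 'g': 1}) -> 'b' has two parents
```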

Latest papers with no code

Investigating the Effect of Relative Positional Embeddings on AMR-to-Text Generation with Structural Adapters

no code yet • 12 Feb 2023

Text generation from Abstract Meaning Representation (AMR) has benefited substantially from the rise of pretrained language models (PLMs).

Interpretable AMR-Based Question Decomposition for Multi-hop Question Answering

no code yet • 16 Jun 2022

We then achieve the decomposition of a multi-hop question via segmentation of the corresponding AMR graph based on the required reasoning type.

Graph Pre-training for AMR Parsing and Generation

no code yet • ACL ARR November 2021

To our knowledge, we are the first to consider pre-training on AMR graphs.

AMR-to-text Generation with Graph Structure Reconstruction and Coverage

no code yet • ACL ARR September 2021

To consider the coverage of AMR graphs, we design a coverage mechanism to solve the problem of information under-translation or over-translation in AMR-to-text generation.

Latent Tree Decomposition Parsers for AMR-to-Text Generation

no code yet • 27 Aug 2021

Graph encoders in AMR-to-text generation models often rely on neighborhood convolutions or global vertex attention.

Tree Decomposition Attention for AMR-to-Text Generation

no code yet • 27 Aug 2021

Text generation from AMR requires mapping a semantic graph to the string it annotates.

Avoiding Overlap in Data Augmentation for AMR-to-Text Generation

no code yet • ACL 2021

We propose methods for excluding parts of Gigaword to remove this overlap, and show that our approach leads to a more realistic evaluation of the task of AMR-to-text generation.

XLPT-AMR: Cross-Lingual Pre-Training via Multi-Task Learning for Zero-Shot AMR Parsing and Text Generation

no code yet • ACL 2021

We hope that knowledge gained while learning English AMR parsing and text generation can be transferred to their counterparts in other languages.

Generalized Shortest-Paths Encoders for AMR-to-Text Generation

no code yet • COLING 2020

Instead of feeding shortest paths to the vertex self-attention module, we train a model to learn them using generalized shortest-paths algorithms.
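
For context on this entry, the sketch below shows the conventional alternative the excerpt alludes to: precomputing pairwise shortest-path lengths over an AMR graph (here with networkx) so they can be fed to a vertex self-attention module. It is a hedged illustration of that baseline, not the paper's learned generalized shortest-paths encoder, and the toy graph is our own example.

```python
# Pairwise shortest-path lengths over an undirected view of an AMR graph,
# the kind of path feature a path-aware vertex self-attention module can
# be fed. Illustrative baseline only; not the learned encoder from the paper.
import networkx as nx

# Toy AMR for "The boy wants to go" as an undirected vertex graph.
edges = [("want-01", "boy"), ("want-01", "go-02"), ("go-02", "boy")]
G = nx.Graph(edges)

# dict-of-dicts: path_len[u][v] = length of the shortest path between u and v
path_len = dict(nx.all_pairs_shortest_path_length(G))
print(path_len["boy"]["go-02"])    # 1 (boy is a reentrant argument of go-02)
print(path_len["want-01"]["boy"])  # 1
```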

Multilingual AMR-to-Text Generation

no code yet • EMNLP 2020

Generating text from structured data is challenging because it requires bridging the gap between (i) structure and natural language (NL) and (ii) semantically underspecified input and fully specified NL output.