AMR-to-Text Generation
15 papers with code • 5 benchmarks • 5 datasets
Abstract Meaning Representation (AMR) is a directed graph of labeled concepts and relations that captures the semantics of a sentence. Its concepts encode propositional meaning while abstracting away from surface lexical properties. AMR is tree-like in structure: it has a single root node, and only a few nodes have multiple parents (reentrancies). The goal of AMR-to-Text Generation is to recover the original sentence realization from an AMR graph. This task can be seen as the reverse of the structured prediction performed in AMR parsing.
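As a concrete illustration, the standard "the boy wants to go" example from the AMR literature can be decoded with the open-source `penman` Python library (an independent sketch, not code from the paper cited below):

```python
# Decode a small AMR with the `penman` library (pip install penman).
# The graph encodes "The boy wants to go"; note the reentrancy:
# variable b (boy) is the :ARG0 of both want-01 and go-02, i.e. a
# single node with multiple parents.
import penman

amr = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
"""

graph = penman.decode(amr)

# The graph is a set of (source, role, target) triples over labeled
# concepts and relations.
for source, role, target in graph.triples:
    print(source, role, target)
# w :instance want-01
# w :ARG0 b
# b :instance boy
# w :ARG1 g
# g :instance go-02
# g :ARG0 b
```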
Source: AMR-to-Text Generation with Cache Transition Systems
Most implemented papers
Investigating Pretrained Language Models for Graph-to-Text Generation
We show that the PLMs BART and T5 achieve new state-of-the-art results and that task-adaptive pretraining strategies improve their performance even further.
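A minimal sketch of this recipe with Hugging Face Transformers follows: the AMR is linearized into a token sequence and generation is treated as ordinary seq2seq decoding. The base checkpoint name is real, but it would first need to be fine-tuned on AMR-to-text pairs (not shown here) to produce meaningful realizations.

```python
# Sketch: AMR-to-text with a pretrained seq2seq LM. Assumes BART has
# been fine-tuned on (linearized AMR, sentence) pairs; off the shelf,
# facebook/bart-large will not generate sensible sentences from AMRs.
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-large"  # fine-tune on AMR-to-text first
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# The input AMR, serialized into a plain token sequence.
linearized_amr = "( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 boy ) )"

inputs = tokenizer(linearized_amr, return_tensors="pt")
output_ids = model.generate(**inputs, num_beams=5, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```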
Structural Neural Encoders for AMR-to-text Generation
AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.
Graph Pre-training for AMR Parsing and Generation
To our knowledge, we are the first to consider pre-training on semantic graphs.
A Graph-to-Sequence Model for AMR-to-Text Generation
The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.
Modeling Graph Structure in Transformer for Better AMR-to-Text Generation
Recent studies on AMR-to-text generation often formalize the task as a sequence-to-sequence (seq2seq) learning problem by converting an Abstract Meaning Representation (AMR) graph into a word sequence.
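One simple depth-first linearization is sketched below; the nested-tuple graph representation is assumed here for illustration and is not the specific encoding used by the paper.

```python
# One way to linearize an AMR into a word sequence for seq2seq
# training: a depth-first traversal that emits concepts and role
# labels as tokens. The nested-tuple input format is an assumption
# made for this sketch.
def linearize(node):
    concept, children = node
    tokens = ["(", concept]
    for role, child in children:
        tokens.append(role)
        tokens.extend(linearize(child) if isinstance(child, tuple) else [child])
    tokens.append(")")
    return tokens

# "The boy wants to go": go-02's :ARG0 reuses the variable b, so the
# reentrancy is flattened into a repeated token.
amr = ("want-01", [(":ARG0", ("boy", [])),
                   (":ARG1", ("go-02", [(":ARG0", "b")]))])

print(" ".join(linearize(amr)))
# ( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 b ) )
```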
Enhancing AMR-to-Text Generation with Dual Graph Representations
Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is a challenging task due to the inherent difficulty of properly encoding the structure of a graph with labeled edges.
Graph Transformer for Graph-to-Sequence Learning
The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.
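To make the receptive-field point concrete, here is a toy message-passing layer in NumPy (a generic GCN-style sketch, not the architecture of this paper): after k layers, each node's representation has only seen its k-hop neighborhood, which is the locality limitation that global graph attention is designed to remove.

```python
# Toy GCN-style layer over the 3-node "boy wants to go" graph:
# want-01 -> boy, want-01 -> go-02, go-02 -> boy.
# Each application widens every node's receptive field by one hop.
import numpy as np

A = np.array([[0, 1, 1],   # want-01's out-edges
              [0, 0, 0],   # boy (leaf)
              [0, 1, 0]],  # go-02 -> boy
             dtype=float)

H = np.eye(3)                           # one-hot initial node features
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 3))  # shared layer weights

def gnn_layer(H, A, W):
    """Mean-aggregate each node's (self + neighbor) states, then transform."""
    A_hat = A + np.eye(len(A))          # add self-loops
    A_norm = A_hat / A_hat.sum(axis=1, keepdims=True)
    return np.tanh(A_norm @ H @ W)

H1 = gnn_layer(H, A, W)   # nodes see 1-hop neighbors
H2 = gnn_layer(H1, A, W)  # nodes see 2-hop neighbors
```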
Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity
Our generated text has significantly better semantic fidelity than the state of the art across all four datasets.
GPT-too: A language-model-first approach for AMR-to-text generation
Abstract Meaning Representations (AMRs) are broad-coverage sentence-level semantic graphs.
Online Back-Parsing for AMR-to-Text Generation
AMR-to-text generation aims to recover a text containing the same meaning as an input AMR graph.