AMR-to-Text Generation
15 papers with code • 5 benchmarks • 5 datasets
Abstract Meaning Representation (AMR) is a directed graph of labeled concepts and relations that captures sentence semantics. Its concepts encode propositional meaning while abstracting away from surface lexical properties. AMR is largely tree-like in structure: it has a single root node and only a few nodes (reentrancies) with multiple parents. The goal of AMR-to-Text Generation is to recover the original sentence realization given an AMR. The task can be seen as the reverse of the structured prediction performed in AMR parsing.
Source: AMR-to-Text Generation with Cache Transition Systems
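To make the task description concrete, the sketch below shows an AMR in PENMAN notation loaded as a graph with the penman Python library; the example sentence and library choice are illustrative assumptions, not part of any particular paper's pipeline.

```python
import penman

# AMR for "The boy wants to go": concepts (want-01, boy, go-02) and labeled
# relations (:ARG0, :ARG1). The reentrant variable `b` is the kind of node
# with multiple parents that makes the graph only tree-like.
amr = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
"""

graph = penman.decode(amr)
print(graph.top)      # 'w' -- the single root node
print(graph.triples)  # [('w', ':instance', 'want-01'), ('w', ':ARG0', 'b'), ...]

# AMR-to-text generation maps this graph back to a realization such as
# "The boy wants to go."
```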
Latest papers
Graph Pre-training for AMR Parsing and Generation
To our knowledge, we are the first to consider pre-training on semantic graphs.
Smelting Gold and Silver for Improved Multilingual AMR-to-Text Generation
Recent work on multilingual AMR-to-text generation has exclusively focused on data augmentation strategies that utilize silver AMR.
One SPRING to Rule Them Both: Symmetric AMR Semantic Parsing and Generation without a Complex Pipeline
In Text-to-AMR parsing, current state-of-the-art semantic parsers use cumbersome pipelines integrating several different modules or components, and exploit graph recategorization, i.e., a set of content-specific heuristics that are developed on the basis of the training set.
Structural Adapters in Pretrained Language Models for AMR-to-text Generation
Pretrained language models (PLMs) have recently advanced graph-to-text generation, where the input graph is linearized into a sequence and fed into the PLM to obtain its representation.
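As a rough illustration of the linearization recipe mentioned above, the sketch below flattens a PENMAN string into a token sequence and feeds it to a seq2seq PLM via Hugging Face transformers. The checkpoint name is a placeholder, and the variable-stripping linearization is just one simple choice; a real setup would use a BART or T5 model fine-tuned on AMR-to-text pairs.

```python
import re
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

def linearize(penman_str: str) -> str:
    """Flatten a PENMAN graph into a single token sequence,
    dropping variable names (a common, simple linearization)."""
    s = re.sub(r"\b[a-z][0-9]*\s*/\s*", "", penman_str)  # remove "w / " etc.
    return " ".join(s.split())

amr = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"
src = linearize(amr)  # "(want-01 :ARG0 (boy) :ARG1 (go-02 :ARG0 b))"

model_name = "your-amr-to-text-checkpoint"  # hypothetical fine-tuned model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

inputs = tokenizer(src, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=5)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```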
Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text Generation
With the help of these strategies, we are able to train a model with fewer parameters while maintaining the model capacity.
Online Back-Parsing for AMR-to-Text Generation
AMR-to-text generation aims to recover a text containing the same meaning as an input AMR graph.
Investigating Pretrained Language Models for Graph-to-Text Generation
We show that the PLMs BART and T5 achieve new state-of-the-art results and that task-adaptive pretraining strategies improve their performance even further.
GPT-too: A language-model-first approach for AMR-to-text generation
Abstract Meaning Representations (AMRs) are broad-coverage sentence-level semantic graphs.
Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity
Our generated text has significantly better semantic fidelity than the state of the art across all four datasets.
Graph Transformer for Graph-to-Sequence Learning
The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.
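The "receptive field" point can be illustrated with a toy sketch (not the paper's model): after k rounds of neighbor averaging over an AMR-like graph, each node's representation reflects only nodes within k hops, which is the kind of locality that global-attention graph encoders aim to overcome. The graph and feature sizes below are made up for illustration.

```python
import numpy as np

# Tiny AMR-like graph: 0=want-01, 1=boy, 2=go-02, with undirected edges
# standing in for :ARG0/:ARG1 relations.
edges = [(0, 1), (0, 2), (2, 1)]
n, d = 3, 4
adj = np.eye(n)                              # self-loops keep each node's own state
for i, j in edges:
    adj[i, j] = adj[j, i] = 1.0
adj = adj / adj.sum(axis=1, keepdims=True)   # row-normalized neighbor averaging

h = np.random.randn(n, d)                    # initial node (concept) features
for _ in range(2):                           # 2 layers -> 2-hop receptive field
    h = np.tanh(adj @ h)                     # one round of message passing

print(h.shape)                               # (3, 4): one vector per concept node
```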