AMR-to-Text Generation

9 papers with code • 2 benchmarks • 1 dataset

Abstract Meaning Representation (AMR) is a directed graph of labeled concepts and relations that captures sentence semantics. Its concepts encode propositional meaning and abstract away from lexical properties of the original sentence. AMR is tree-like in structure: it has a single root node, and only a few nodes have multiple parents (reentrancies). The goal of AMR-to-Text Generation is to recover the original sentence realization given an AMR graph. The task can be seen as the reverse of the structured prediction performed in AMR parsing.

Source: AMR-to-Text Generation with Cache Transition Systems
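
As an illustration of the structure described above (a minimal sketch, not part of the source page; it assumes the third-party `penman` Python package and a hand-picked example sentence), the snippet below decodes an AMR for "The boy wants to go" and shows its single root and a reentrancy:

```python
# Minimal sketch: decode a hand-written AMR for "The boy wants to go"
# and inspect its structure. Assumes the third-party `penman` package
# (pip install penman) is installed.
import penman

amr_string = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
"""

graph = penman.decode(amr_string)

print(graph.top)      # 'w' -- the single root concept (want-01)
print(graph.triples)  # (source, role, target) triples such as
                      # ('w', ':instance', 'want-01'), ('w', ':ARG0', 'b'), ...

# The variable `b` (boy) is the target of two relations (:ARG0 of want-01
# and :ARG0 of go-02) -- a reentrancy, which is why AMR is a graph rather
# than a strict tree. AMR-to-text generation maps this graph back to a
# sentence such as "The boy wants to go."
print([t for t in graph.triples if t[2] == 'b'])
```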

Greatest papers with code

Graph Transformer for Graph-to-Sequence Learning

jcyk/gtos 18 Nov 2019

The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.
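
To make the receptive-field intuition concrete, here is a small, library-free toy sketch (an illustration under assumed inputs, not the paper's architecture): after k rounds of neighborhood aggregation, each node has received information from nodes up to k hops away.

```python
# Toy illustration (not the paper's model): neighborhood aggregation over a
# small AMR graph, showing how a node's receptive field grows with each
# message-passing layer of a graph neural network.
from collections import defaultdict

# (source, relation, target) edges of the AMR for "The boy wants to go"
edges = [("want-01", "ARG0", "boy"),
         ("want-01", "ARG1", "go-02"),
         ("go-02", "ARG0", "boy")]

neighbors = defaultdict(set)
for src, _rel, tgt in edges:          # treat the graph as undirected here
    neighbors[src].add(tgt)
    neighbors[tgt].add(src)

# Approximate each node's "representation" by the set of nodes it has heard
# from; one aggregation round corresponds to one GNN layer.
receptive_field = {n: {n} for n in neighbors}
for layer in range(1, 3):
    receptive_field = {
        n: receptive_field[n].union(*(receptive_field[m] for m in neighbors[n]))
        for n in neighbors
    }
    print(f"after layer {layer}:", {n: sorted(f) for n, f in receptive_field.items()})
```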

AMR-to-Text Generation • Graph Representation Learning +3

A Graph-to-Sequence Model for AMR-to-Text Generation

freesunshine0316/neural-graph-to-seq-mp ACL 2018

The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.

Ranked #1 on Graph-to-Sequence on LDC2015E86 (using extra training data)

AMR-to-Text Generation • Graph-to-Sequence +1

Modeling Graph Structure in Transformer for Better AMR-to-Text Generation

Amazing-J/structural-transformer IJCNLP 2019

Recent studies on AMR-to-text generation often formalize the task as a sequence-to-sequence (seq2seq) learning problem by converting an Abstract Meaning Representation (AMR) graph into a word sequence.
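
As a hedged illustration of such a conversion (a simplified sketch, not this paper's exact preprocessing), the snippet below linearizes a PENMAN-style AMR into a token sequence by dropping variable names and keeping concepts, relation labels, and brackets:

```python
import re

def linearize_amr(penman_str: str) -> str:
    """Naive AMR linearization for seq2seq models (illustrative only):
    drop variable names, keep concepts, relations, and parentheses.
    Real pipelines also handle reentrant variables (the bare `b` below)."""
    s = re.sub(r"\s+", " ", penman_str.strip())
    s = re.sub(r"\(\s*\w+\s*/\s*", "( ", s)   # "(w / want-01" -> "( want-01"
    s = s.replace(")", " )")
    return re.sub(r"\s+", " ", s).strip()

amr = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"
print(linearize_amr(amr))
# -> ( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 b ) )
```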

AMR-to-Text Generation • Text Generation

Investigating Pretrained Language Models for Graph-to-Text Generation

UKPLab/plms-graph2text 16 Jul 2020

We show that the PLMs BART and T5 achieve new state-of-the-art results and that task-adaptive pretraining strategies improve their performance even further.
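
The general recipe can be sketched as follows (an assumption-laden illustration, not the authors' released code: `t5-base` is an off-the-shelf checkpoint and would need fine-tuning on linearized AMR/sentence pairs to generate sensible text):

```python
# Sketch only: assumes the Hugging Face `transformers` library. In the
# paper's setting, BART/T5 are fine-tuned on linearized AMR graphs paired
# with their reference sentences; t5-base here is just a placeholder.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "t5-base"  # a task-adapted, fine-tuned checkpoint is needed in practice
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

linearized_amr = "( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 b ) )"
inputs = tokenizer(linearized_amr, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=32, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```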

AMR-to-Text Generation • Data-to-Text Generation +2

Structural Neural Encoders for AMR-to-text Generation

mdtux89/OpenNMT-py NAACL 2019

AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.

AMR-to-Text Generation • Graph-to-Sequence +1

Enhancing AMR-to-Text Generation with Dual Graph Representations

UKPLab/emnlp2019-dualgraph IJCNLP 2019

Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is a challenging task due to the inherent difficulty of properly encoding the structure of a graph with labeled edges.
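
A generic way to see the issue (an illustrative sketch under assumed inputs, not the paper's exact dual-graph construction) is to expose each labeled, directed edge to the encoder in two complementary views, top-down and bottom-up:

```python
# Illustration only (not the paper's model): build two directed views of an
# AMR graph so an encoder sees both outgoing (top-down) and incoming
# (bottom-up) labeled edges for every node.
from collections import defaultdict

triples = [("want-01", "ARG0", "boy"),
           ("want-01", "ARG1", "go-02"),
           ("go-02", "ARG0", "boy")]

top_down = defaultdict(list)   # node -> [(relation, child)]
bottom_up = defaultdict(list)  # node -> [(inverse relation, parent)]
for src, rel, tgt in triples:
    top_down[src].append((rel, tgt))
    bottom_up[tgt].append((rel + "-of", src))  # AMR-style inverse role

print(dict(top_down))
print(dict(bottom_up))
# "boy" has incoming edges from both want-01 and go-02 in the bottom-up
# view, so the reentrancy is visible from both directions.
```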

AMR-to-Text Generation • Data-to-Text Generation +1

Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text Generation

yanzhang92/LDGCNs EMNLP 2020

With the proposed parameter-saving strategies, the model can be trained with fewer parameters while maintaining its capacity.

AMR-to-Text Generation • Text Generation