The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.
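To make the receptive-field point concrete, here is a minimal NumPy sketch of message passing; the layer form, the toy chain graph, and all names are illustrative assumptions, not any specific model from the work listed here.

```python
# Minimal message-passing sketch (NumPy). After k propagation steps each
# node's state depends on its k-hop neighbourhood -- the "receptive field"
# referred to above. Shapes, graph, and weights are illustrative only.
import numpy as np

def gnn_layer(H, A, W):
    """One step: mix each node's state with its 1-hop neighbours, then transform."""
    return np.tanh(A @ H @ W)

n, d = 5, 8
A = 0.5 * np.eye(n) + 0.5 * np.eye(n, k=1)   # toy directed chain with self-loops
H = np.random.randn(n, d)                    # initial node features
W = np.random.randn(d, d) * 0.1              # stand-in for learned weights
for _ in range(3):   # stacking 3 layers widens the receptive field to 3 hops
    H = gnn_layer(H, A, W)
```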
Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is a challenging task due to the inherent difficulty of properly encoding the structure of a graph with labeled edges.
We propose to split the generation process into a symbolic text-planning stage that is faithful to the input, followed by a neural generation stage that focuses only on realization.
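As a toy illustration of that split (our own stand-ins, not the paper's actual planner or realizer), the symbolic stage below only reorders the input triples, so it cannot hallucinate content, while verbalization is a separate, swappable step:

```python
# Hypothetical plan-then-realize sketch. The symbolic planner is faithful by
# construction: it only orders the given facts. Realization is stubbed out;
# in the work described above it would be a neural surface realizer.
triples = [("John", "lives_in", "Paris"), ("Paris", "capital_of", "France")]

def plan(facts):
    """Symbolic planning: choose a deterministic order over the input facts."""
    return sorted(facts, key=lambda t: t[0])

def realize(planned):
    """Realization stub standing in for a learned generator."""
    return " ".join(f"{s} {p.replace('_', ' ')} {o}." for s, p, o in planned)

print(realize(plan(triples)))
# -> "John lives in Paris. Paris capital of France."
```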
AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.
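For readers new to the formalism, the standard introductory example is the sentence "The boy wants to go.", shown below in PENMAN notation and as the equivalent labeled triples (hand-written here for illustration):

```python
# Canonical introductory AMR for "The boy wants to go."
amr = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))"

triples = [
    ("w", "instance", "want-01"),
    ("b", "instance", "boy"),
    ("g", "instance", "go-01"),
    ("w", "ARG0", "b"),   # the boy is the wanter
    ("w", "ARG1", "g"),   # the going is what is wanted
    ("g", "ARG0", "b"),   # re-entrancy: the same boy is the goer
]
```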
Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods.
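A common instance of this approach is to linearize the graph, for example by depth-first traversal, so a standard seq2seq model can read it as a flat token string. The sketch below uses a toy adjacency-list format of our own devising:

```python
# Depth-first linearization of a labeled graph into a token sequence,
# the kind of flattening standard seq2seq pipelines rely on. The graph
# format and bracketing scheme are simplifying assumptions.
def linearize(node, graph, seen=None):
    seen = set() if seen is None else seen
    tokens = [node]
    if node in seen:
        return tokens            # cut re-entrancies / cycles
    seen.add(node)
    for label, child in graph.get(node, []):
        tokens += [f":{label}", "("] + linearize(child, graph, seen) + [")"]
    return tokens

graph = {"want-01": [("ARG0", "boy"), ("ARG1", "go-01")],
         "go-01": [("ARG0", "boy")]}
print(" ".join(linearize("want-01", graph)))
# -> "want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 ( boy ) )"
```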
Previous work approaches the SQL-to-text generation task using vanilla Seq2Seq models, which may not fully capture the graph-structured information inherent in a SQL query.
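A rough sketch of the graph view such work argues for: clauses become typed nodes, and edges record how columns, tables, and conditions attach to one another. The node and edge inventory here is a simplifying assumption, not the paper's actual construction:

```python
# Toy graph encoding of a SQL query; a graph encoder would consume this
# structure instead of the flat token sequence a vanilla Seq2Seq sees.
sql = "SELECT name FROM city WHERE population > 1000000"

graph = {
    "nodes": ["SELECT", "name", "city", "population > 1000000"],
    "edges": [
        ("SELECT", "column", "name"),
        ("SELECT", "from",   "city"),
        ("SELECT", "where",  "population > 1000000"),
    ],
}
```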
Many NLP applications can be framed as a graph-to-sequence learning problem.
The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.
Our method first generates the node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy to incorporate edge direction information in the node embeddings.
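One plausible reading of that direction-aware aggregation, sketched in NumPy below: pool incoming- and outgoing-neighbour states separately, then combine them with the node's own state. Mean pooling and concatenation are our simplifying assumptions; the paper's exact aggregator may differ.

```python
# Direction-aware node update: forward (outgoing) and backward (incoming)
# neighbours are pooled separately so edge direction survives in the
# embedding. Adjacency lists, pooling, and weights are illustrative.
import numpy as np

def directional_update(H, out_nbrs, in_nbrs, W):
    """H: (n, d) node states; out_nbrs/in_nbrs: dicts of neighbour lists; W: (3d, d)."""
    n, d = H.shape
    new_H = np.empty_like(H)
    for v in range(n):
        fwd = H[out_nbrs[v]].mean(axis=0) if out_nbrs[v] else np.zeros(d)
        bwd = H[in_nbrs[v]].mean(axis=0) if in_nbrs[v] else np.zeros(d)
        new_H[v] = np.tanh(np.concatenate([H[v], fwd, bwd]) @ W)
    return new_H

H = np.random.randn(3, 4)
W = np.random.randn(12, 4) * 0.1
out_nbrs = {0: [1], 1: [2], 2: []}   # directed chain 0 -> 1 -> 2
in_nbrs  = {0: [], 1: [0], 2: [1]}
H = directional_update(H, out_nbrs, in_nbrs, W)
```

A graph-level embedding can then be obtained by pooling the resulting node embeddings (e.g. mean or max), which is plausibly what the "graph embeddings" above refer to.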
Sequence-to-sequence models have shown strong performance across a broad range of applications.