- SOTA for Graph-to-Sequence on LDC2015E86 (using extra training data): The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.
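To make the AMR-to-text setting above concrete, here is a toy illustration of the input side of the task; the node/edge encoding is an assumption for exposition, not any paper's model.

```python
# Toy illustration of the AMR-to-text setting: an AMR graph pairs concept
# nodes with labeled, directed edges, and the task is to generate a
# sentence with the same meaning. (Illustrative encoding, not a model.)

# AMR for "The boy wants to go": variables map to concepts, role edges
# connect them.
amr_nodes = {"w": "want-01", "b": "boy", "g": "go-02"}
amr_edges = [("w", ":ARG0", "b"),   # the wanter is the boy
             ("w", ":ARG1", "g"),   # the thing wanted is the going
             ("g", ":ARG0", "b")]   # the goer is (also) the boy

reference_text = "The boy wants to go."

# A generator must map (amr_nodes, amr_edges) -> reference_text; graph
# encoders consume exactly this node/edge structure.
outgoing = {}
for src, role, tgt in amr_edges:
    outgoing.setdefault(src, []).append((role, amr_nodes[tgt]))

print(outgoing["w"])  # [(':ARG0', 'boy'), (':ARG1', 'go-02')]
```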
- SOTA for SQL-to-Text on WikiSQL: Our method first generates the node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy that incorporates edge-direction information into the node embeddings.
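A minimal sketch of direction-aware neighbor aggregation in the spirit of the entry above: incoming and outgoing neighbors get separate transformations before being pooled into the node embedding. All names, dimensions, and the tanh update are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

# Sketch: one propagation step that treats incoming and outgoing edges
# differently, so edge direction is reflected in the node embeddings.

rng = np.random.default_rng(0)
dim = 4
nodes = ["a", "b", "c"]
edges = [("a", "b"), ("a", "c")]                   # directed: a -> b, a -> c
h = {n: rng.standard_normal(dim) for n in nodes}   # initial node features

W_in = rng.standard_normal((dim, dim))    # transform for incoming neighbors
W_out = rng.standard_normal((dim, dim))   # transform for outgoing neighbors

def aggregate(node):
    """Pool incoming and outgoing neighbors with separate weights."""
    in_nb = [h[s] for s, t in edges if t == node]
    out_nb = [h[t] for s, t in edges if s == node]
    msg = np.zeros(dim)
    if in_nb:
        msg += W_in @ np.mean(in_nb, axis=0)
    if out_nb:
        msg += W_out @ np.mean(out_nb, axis=0)
    return np.tanh(h[node] + msg)

new_h = {n: aggregate(n) for n in nodes}
# Simple graph-level readout: mean over updated node embeddings.
graph_embedding = np.mean(list(new_h.values()), axis=0)
```

In practice such steps are stacked, and the readout feeds a sequence decoder; the point here is only the separate in/out aggregation.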
- SOTA for Data-to-Text Generation on WebNLG: Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods.
- #2 best model for Data-to-Text Generation on WebNLG: We propose to split the generation process into a symbolic text-planning stage that is faithful to the input, followed by a neural generation stage that focuses only on realization.
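The plan-then-realize split above can be sketched in miniature: a symbolic stage orders the input facts without dropping any, and a separate stage verbalizes the plan. The templates stand in for the neural realizer and are purely illustrative, as are the fact triples.

```python
# Two-stage data-to-text sketch: symbolic planning, then realization.

facts = [("John_Doe", "birthPlace", "London"),
         ("John_Doe", "occupation", "Writer")]

def plan(triples):
    # Symbolic stage: faithful to the input -- every triple is kept,
    # only the order is decided (here: sort by predicate).
    return sorted(triples, key=lambda t: t[1])

templates = {"birthPlace": "{s} was born in {o}.",
             "occupation": "{s} works as a {o}."}

def realize(ordered_triples):
    # Realization stage: concerned only with surface form.
    sents = [templates[p].format(s=s.replace("_", " "), o=o)
             for s, p, o in ordered_triples]
    return " ".join(sents)

print(realize(plan(facts)))
# -> John Doe was born in London. John Doe works as a Writer.
```

Because the planner never deletes or invents triples, faithfulness is enforced structurally; only the realizer's wording can vary.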
- Previous work approaches the SQL-to-text generation task with vanilla Seq2Seq models, which may not fully capture the graph-structured information inherent in a SQL query.
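One way to expose that structure, sketched under assumptions (the node/edge scheme is an illustrative choice, not the paper's exact construction), is to turn the query into a directed graph that a graph encoder can consume instead of a flat token sequence.

```python
# Sketch: represent a simple SQL query as a directed graph.

query = {"select": ["name"], "from": "employees",
         "where": [("salary", ">", "50000")]}

def sql_to_graph(q):
    nodes, edges = ["SELECT"], []
    for col in q["select"]:
        nodes.append(col)
        edges.append(("SELECT", col))        # SELECT points at its columns
    nodes.append("WHERE")
    edges.append(("SELECT", "WHERE"))        # query skeleton
    for col, op, val in q["where"]:
        cond = f"{col} {op} {val}"
        nodes.append(cond)
        edges.append(("WHERE", cond))        # WHERE points at its conditions
    return nodes, edges

nodes, edges = sql_to_graph(query)
print(edges)
# [('SELECT', 'name'), ('SELECT', 'WHERE'), ('WHERE', 'salary > 50000')]
```

A Seq2Seq model sees only the linearized tokens; the graph form makes the clause structure explicit for the encoder.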
- #2 best model for Graph-to-Sequence on LDC2015E86: AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.
- Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is challenging because of the difficulty of properly encoding the structure of a graph with labeled edges.
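One common answer to the labeled-edge problem, sketched here as an assumption rather than any specific paper's encoder, is to give each edge label its own transformation when aggregating neighbor messages (as in relational graph convolutional networks).

```python
import numpy as np

# Sketch: per-edge-label weight matrices, so the role an edge plays
# (e.g. :ARG0 vs :ARG1) changes how its message is transformed.

rng = np.random.default_rng(1)
dim = 4
h = {"want-01": rng.standard_normal(dim),   # toy AMR node features
     "boy": rng.standard_normal(dim)}
edges = [("want-01", ":ARG0", "boy")]
labels = {":ARG0", ":ARG1"}
W = {lab: rng.standard_normal((dim, dim)) for lab in labels}  # one matrix per label
W_self = rng.standard_normal((dim, dim))                      # self-connection

def encode(node):
    msg = W_self @ h[node]
    for src, lab, tgt in edges:
        if tgt == node:                 # message flows along the labeled edge
            msg += W[lab] @ h[src]
    return np.tanh(msg)

encoded = {n: encode(n) for n in h}
```

Without label-specific weights, `:ARG0` and `:ARG1` edges would be indistinguishable to the encoder, losing exactly the structure the entry above points to.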