Our method first generates the node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy to incorporate edge direction information in the node embeddings.
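A minimal sketch of what a direction-aware aggregation can look like: each node concatenates the mean of its in-neighbor features with the mean of its out-neighbor features, so edge direction survives in the node embedding. This is an illustrative toy (plain Python on an edge list), not the paper's actual architecture; the function name `aggregate_directional` and the graph representation are assumptions.

```python
def aggregate_directional(edges, features):
    """Toy direction-aware aggregation (illustrative, not the paper's method):
    each node's embedding is [mean of in-neighbor feats, mean of out-neighbor feats],
    so reversing an edge changes the result."""
    n = len(features)
    dim = len(features[0])
    in_sum = [[0.0] * dim for _ in range(n)]
    out_sum = [[0.0] * dim for _ in range(n)]
    in_cnt = [0] * n
    out_cnt = [0] * n
    for src, dst in edges:
        # src's feature reaches dst along an incoming edge;
        # dst's feature reaches src along an outgoing edge.
        for k in range(dim):
            in_sum[dst][k] += features[src][k]
            out_sum[src][k] += features[dst][k]
        in_cnt[dst] += 1
        out_cnt[src] += 1
    embeddings = []
    for v in range(n):
        in_mean = [s / in_cnt[v] if in_cnt[v] else 0.0 for s in in_sum[v]]
        out_mean = [s / out_cnt[v] if out_cnt[v] else 0.0 for s in out_sum[v]]
        embeddings.append(in_mean + out_mean)  # concatenation keeps direction info
    return embeddings
```

On the directed path 0 → 1 → 2 with scalar features [1.0], [2.0], [3.0], node 1 gets embedding [1.0, 3.0] (in-mean from node 0, out-mean from node 2), while an undirected mean would collapse both to a single 2.0.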
SOTA for SQL-to-Text on WikiSQL
The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.
SOTA for Graph-to-Sequence on LDC2015E86 (using extra training data)
Sequence-to-sequence models have shown strong performance across a broad range of applications.
Many NLP applications can be framed as a graph-to-sequence learning problem.
We propose to split the generation process into a symbolic text-planning stage that is faithful to the input, followed by a neural generation stage that focuses only on realization.
#6 best model for Data-to-Text Generation on WebNLG
The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.
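The receptive-field idea can be made concrete: after k message-passing layers, a node's representation depends only on its k-hop neighbourhood. The helper below (an assumed name, `receptive_field`, operating on a plain adjacency-list dict; not code from any of the listed papers) computes that neighbourhood.

```python
def receptive_field(adj, node, k):
    """Return the set of nodes within k hops of `node` in the graph `adj`
    (adjacency-list dict) -- i.e. the nodes a k-layer message-passing GNN
    can draw information from when embedding `node`."""
    frontier = {node}
    reached = {node}
    for _ in range(k):
        # Expand one hop per layer, skipping nodes already reached.
        frontier = {nb for v in frontier for nb in adj.get(v, [])} - reached
        reached |= frontier
    return reached
```

On the path graph 0–1–2–3, one layer lets node 0 see only {0, 1}; two layers extend its receptive field to {0, 1, 2}, which is why deeper stacks are needed to capture long-range graph structure.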
Natural question generation (QG) aims to generate questions from a passage and an answer.
AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.
#2 best model for Graph-to-Sequence on LDC2015E86