Graph-to-Sequence

23 papers with code • 2 benchmarks • 2 datasets

Mapping an input graph to an output sequence, such as text.

Greatest papers with code

Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks

IBM/Graph2Seq ICLR 2019

Our method first generates the node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy to incorporate edge direction information in the node embeddings.

Graph-to-Sequence SQL-to-Text +1
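
The direction-aware aggregation described above can be sketched as follows. This is a minimal illustration, assuming mean aggregation over outgoing and incoming neighbor sets with the two directional states concatenated; the class and parameter names are hypothetical, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of direction-aware neighbor
# aggregation in the spirit of Graph2Seq: each node keeps a "forward"
# (outgoing) and a "backward" (incoming) representation, aggregated
# over the corresponding neighbor sets and concatenated.
import torch
import torch.nn as nn

class DirectionalAggregator(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.fwd_proj = nn.Linear(2 * dim, dim)  # combine self + forward neighbors
        self.bwd_proj = nn.Linear(2 * dim, dim)  # combine self + backward neighbors

    def forward(self, h, edges):
        # h: (num_nodes, dim) node embeddings; edges: list of (src, dst) pairs.
        n, _ = h.shape
        fwd = torch.zeros_like(h)     # sum over successors of each node
        bwd = torch.zeros_like(h)     # sum over predecessors of each node
        fwd_cnt = torch.zeros(n, 1)
        bwd_cnt = torch.zeros(n, 1)
        for src, dst in edges:
            fwd[src] += h[dst]; fwd_cnt[src] += 1
            bwd[dst] += h[src]; bwd_cnt[dst] += 1
        fwd = fwd / fwd_cnt.clamp(min=1)          # mean over forward neighbors
        bwd = bwd / bwd_cnt.clamp(min=1)          # mean over backward neighbors
        h_fwd = torch.relu(self.fwd_proj(torch.cat([h, fwd], dim=-1)))
        h_bwd = torch.relu(self.bwd_proj(torch.cat([h, bwd], dim=-1)))
        return torch.cat([h_fwd, h_bwd], dim=-1)  # direction-aware embedding

# Toy usage: a 3-node directed graph 0 -> 1 -> 2.
h = torch.randn(3, 8)
out = DirectionalAggregator(8)(h, [(0, 1), (1, 2)])
print(out.shape)  # torch.Size([3, 16])
```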

Coherent Comments Generation for Chinese Articles with a Graph-to-Sequence Model

lancopku/Graph-to-seq-comment-generation ACL 2019

In this paper, we propose to generate comments with a graph-to-sequence model that models the input news as a topic interaction graph.

Graph-to-Sequence
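
As a rough illustration of a topic interaction graph, the sketch below connects topic keywords that co-occur within a sentence and attaches the shared sentences to the edge; the function name and the keyword-matching heuristic are stand-ins, not the paper's construction.

```python
# Illustrative sketch (not the authors' code) of building a topic
# interaction graph from an article: topic keywords become vertices,
# and two topics are connected when they co-occur in a sentence, with
# the shared sentences attached to the edge as its content.
from collections import defaultdict

def build_topic_graph(sentences, topics):
    """sentences: list of token lists; topics: set of topic keywords."""
    vertices = {t: [] for t in topics}   # topic -> sentences mentioning it
    edges = defaultdict(list)            # (topic_a, topic_b) -> shared sentences
    for sent in sentences:
        present = sorted(t for t in topics if t in sent)
        for t in present:
            vertices[t].append(sent)
        for i in range(len(present)):
            for j in range(i + 1, len(present)):
                edges[(present[i], present[j])].append(sent)
    return vertices, dict(edges)

sents = [["apple", "releases", "phone"],
         ["phone", "camera", "praised"],
         ["apple", "stock", "rises"]]
v, e = build_topic_graph(sents, {"apple", "phone", "camera"})
print(list(e))  # [('apple', 'phone'), ('camera', 'phone')]
```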

Graph Transformer for Graph-to-Sequence Learning

jcyk/gtos 18 Nov 2019

The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.

AMR-to-Text Generation Graph Representation Learning +3
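
The contrast drawn above, local receptive fields versus global attention biased by pairwise structure, can be sketched as follows. This assumes a scalar learned bias per relation type (e.g., a bucketed shortest-path distance between nodes) and is illustrative, not the paper's implementation.

```python
# Sketch (assumed, not the paper's code) of relation-aware
# self-attention over graph nodes: every pair of nodes attends
# globally, and a learned embedding of their pairwise relation
# biases the attention score.
import torch
import torch.nn as nn

class RelationAwareAttention(nn.Module):
    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.rel_bias = nn.Embedding(num_relations, 1)  # scalar bias per relation

    def forward(self, h, rel):
        # h: (n, dim) node states; rel: (n, n) relation ids between node pairs.
        q, k, v = self.q(h), self.k(h), self.v(h)
        scores = q @ k.t() / h.shape[-1] ** 0.5           # (n, n) content scores
        scores = scores + self.rel_bias(rel).squeeze(-1)  # structural bias
        return torch.softmax(scores, dim=-1) @ v          # (n, dim)

h = torch.randn(4, 16)
rel = torch.randint(0, 5, (4, 4))  # e.g., bucketed shortest-path distances
out = RelationAwareAttention(16, 5)(h, rel)
print(out.shape)  # torch.Size([4, 16])
```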

Deep Graph Convolutional Encoders for Structured Data to Text Generation

diegma/graph-2-text WS 2018

Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods.

Data-to-Text Generation Graph-to-Sequence
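
A minimal sketch of a graph convolutional encoder layer of the general kind used here, assuming mean aggregation with self-loops; it is illustrative rather than the paper's exact architecture.

```python
# Minimal sketch of a standard graph convolutional layer: each node's
# new state is a degree-normalized sum of its neighbors' states (plus
# itself), passed through a linear transform and a nonlinearity.
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, h, adj):
        # h: (n, in_dim); adj: (n, n) 0/1 adjacency matrix of the input graph.
        a = adj + torch.eye(adj.shape[0])           # add self-loops
        deg = a.sum(dim=-1, keepdim=True)           # degrees for normalization
        return torch.relu(self.lin((a / deg) @ h))  # mean-aggregate, transform

# Two stacked layers give each node a two-hop receptive field.
adj = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
h = torch.randn(3, 8)
h = GCNLayer(8, 8)(h, adj)
h = GCNLayer(8, 8)(h, adj)
print(h.shape)  # torch.Size([3, 8])
```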

A Graph-to-Sequence Model for AMR-to-Text Generation

freesunshine0316/neural-graph-to-seq-mp ACL 2018

The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.

Ranked #1 on Graph-to-Sequence on LDC2015E86 (using extra training data)

AMR-to-Text Generation Graph-to-Sequence +1
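
To make the setting concrete, the sketch below encodes a toy AMR graph by iterated message passing along labeled relation edges; it is a hedged stand-in for the paper's graph-state encoder, and all class, parameter, and variable names are hypothetical.

```python
# Hedged sketch of graph-state message passing over an AMR graph
# (illustrative, not the authors' graph-state LSTM): node states are
# refined for several steps by exchanging messages along labeled AMR
# edges, so information propagates beyond one hop.
import torch
import torch.nn as nn

class AMRMessagePassing(nn.Module):
    def __init__(self, dim: int, steps: int = 3):
        super().__init__()
        self.steps = steps
        self.msg = nn.Linear(2 * dim, dim)  # message from (neighbor, edge label)
        self.upd = nn.GRUCell(dim, dim)     # state update from summed messages

    def forward(self, h, edges, edge_emb):
        # h: (n, dim) concept states; edges: [(src, dst, label_id)];
        # edge_emb: (num_labels, dim) embeddings for relations like :ARG0.
        for _ in range(self.steps):
            inbox = torch.zeros_like(h)
            for src, dst, lab in edges:
                inbox[dst] += self.msg(torch.cat([h[src], edge_emb[lab]]))
            h = self.upd(inbox, h)
        return h

# Toy AMR: (want :ARG0 boy :ARG1 go) -> "the boy wants to go"
h = torch.randn(3, 16)         # concepts: want, boy, go
edge_emb = torch.randn(2, 16)  # relations: :ARG0, :ARG1
out = AMRMessagePassing(16)(h, [(0, 1, 0), (0, 2, 1)], edge_emb)
print(out.shape)  # torch.Size([3, 16])
```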

Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation

AmitMY/chimera NAACL 2019

We propose to split the generation process into a symbolic text-planning stage that is faithful to the input, followed by a neural generation stage that focuses only on realization.

Data-to-Text Generation Graph-to-Sequence
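
The planning/realization split can be illustrated with a toy pipeline; the function names and the trivial ordering heuristic below are stand-ins, not the paper's method, and a real system would run a trained neural generator in the realization stage.

```python
# Hedged sketch of a plan-then-realize pipeline: a symbolic planner
# first orders the input facts into per-sentence plans, and only then
# does a generation stage verbalize each planned sentence.
def plan(triples):
    """Symbolic stage: group and order facts into per-sentence plans.
    Here, one sentence per subject in input order, as a stand-in for a
    learned plan-selection step."""
    sentences = {}
    for subj, rel, obj in triples:
        sentences.setdefault(subj, []).append((subj, rel, obj))
    return list(sentences.values())

def realize(sentence_plan):
    """Neural stage stand-in: verbalize one planned sentence with a
    template; a real system would run a trained seq2seq model here."""
    subj = sentence_plan[0][0]
    parts = [f"{rel.replace('_', ' ')} {obj}" for _, rel, obj in sentence_plan]
    return f"{subj} {' and '.join(parts)}."

triples = [("John", "birth_place", "London"),
           ("John", "occupation", "engineer"),
           ("London", "country", "England")]
print(" ".join(realize(s) for s in plan(triples)))
# John birth place London and occupation engineer. London country England.
```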