Graph-to-Sequence

26 papers with code • 2 benchmarks • 3 datasets

Mapping an input graph to a sequence of vectors.
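The core idea can be sketched as an encoder that turns graph structure into node embeddings, followed by a decoder that emits a sequence conditioned on them. Below is a minimal, untrained sketch of this pipeline; all weights, dimensions, and function names are illustrative, not from any specific paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_graph(adj, node_feats, w):
    """One round of neighbourhood message passing: each node sums its
    neighbours' features with its own, then a shared linear map + ReLU
    produces node embeddings."""
    agg = adj @ node_feats + node_feats   # self + neighbour aggregation
    return np.maximum(agg @ w, 0.0)       # ReLU

def decode_sequence(node_emb, w_out, steps):
    """Toy autoregressive decoder: at each step, attend over the node
    embeddings via dot-product against the running state, mix the
    attended context into the state, and emit an argmax token."""
    state = node_emb.mean(axis=0)          # initial state = graph summary
    tokens = []
    for _ in range(steps):
        attn = node_emb @ state
        weights = np.exp(attn - attn.max())
        weights /= weights.sum()           # softmax attention weights
        context = weights @ node_emb       # attention-weighted context
        state = 0.5 * state + 0.5 * context
        tokens.append(int(np.argmax(state @ w_out)))
    return tokens

# Toy 3-node path graph: 0-1, 1-2
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
feats = rng.normal(size=(3, 4))            # 4-dim node features
w_enc = rng.normal(size=(4, 4))
w_out = rng.normal(size=(4, 5))            # vocabulary of 5 tokens

emb = encode_graph(adj, feats, w_enc)
seq = decode_sequence(emb, w_out, steps=3)
```

Real Graph2Seq models replace the single message-passing round with a multi-layer graph encoder (GGNN, GCN, or a graph transformer) and the toy decoder with a trained attention-based RNN or transformer decoder.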


Latest papers with no code

Heterogeneous Graph Transformer for Graph-to-Sequence Learning

no code yet • ACL 2020

Graph-to-sequence (Graph2Seq) learning aims to transduce graph-structured representations into word sequences for text generation.

Learning to Encode Evolutionary Knowledge for Automatic Commenting Long Novels

no code yet • 21 Apr 2020

Static knowledge graphs have been incorporated extensively into the sequence-to-sequence framework for text generation.

GraphTTS: graph-to-sequence modelling in neural text-to-speech

no code yet • 4 Mar 2020

This paper leverages the graph-to-sequence method in neural text-to-speech (GraphTTS), which maps the graph embedding of the input sequence to spectrograms.

Equivalence of Dataflow Graphs via Rewrite Rules Using a Graph-to-Sequence Neural Model

no code yet • 17 Feb 2020

In this work we target the problem of provably computing the equivalence between two programs represented as dataflow graphs.

AMR-To-Text Generation with Graph Transformer

no code yet • TACL 2020

Abstract meaning representation (AMR)-to-text generation is the challenging task of generating natural language texts from AMR graphs, where nodes represent concepts and edges denote relations.

DynGraph2Seq: Dynamic-Graph-to-Sequence Interpretable Learning for Health Stage Prediction in Online Health Forums

no code yet • 22 Aug 2019

In this paper, we first formulate the transition of user activities as a dynamic graph with multi-attributed nodes, then formalize the health stage inference task as a dynamic graph-to-sequence learning problem, and hence propose a novel dynamic graph-to-sequence neural networks architecture (DynGraph2Seq) to address all the challenges.