Graph-to-Sequence
26 papers with code • 2 benchmarks • 3 datasets
Graph-to-sequence learning maps an input graph to an output sequence, such as a sequence of words or vectors.
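The task definition above can be sketched as a graph encoder followed by a sequential decoder. The minimal NumPy sketch below is purely illustrative — the function names, dimensions, and the simple mean-aggregation encoder and greedy attention decoder are assumptions for exposition, not the method of any paper listed here.

```python
import numpy as np

# Illustrative graph-to-sequence sketch (assumed design, not from a listed paper).
# Encoder: one round of mean-neighbor message passing over node features.
# Decoder: greedy steps that attend over node embeddings and emit token ids.

rng = np.random.default_rng(0)

def encode_graph(node_feats, adj):
    """One message-passing step: each node averages itself and its neighbors."""
    deg = adj.sum(axis=1, keepdims=True) + 1.0      # +1 counts the self-loop
    h = (node_feats + adj @ node_feats) / deg       # mean aggregation
    return np.tanh(h)                               # nonlinearity

def decode_sequence(node_embs, out_proj, max_len=5):
    """Greedy decoder: attend over node embeddings, project to vocab, argmax."""
    tokens = []
    state = node_embs.mean(axis=0)                  # initial state = graph summary
    for _ in range(max_len):
        scores = node_embs @ state                  # dot-product attention scores
        attn = np.exp(scores - scores.max())
        attn /= attn.sum()
        context = attn @ node_embs                  # attended graph context
        logits = out_proj @ context                 # project to toy vocabulary
        tokens.append(int(np.argmax(logits)))
        state = 0.5 * state + 0.5 * context         # simple recurrent update
    return tokens

# Toy input: 4 nodes with 8-dim features on a ring-shaped adjacency matrix.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
feats = rng.normal(size=(4, 8))
embs = encode_graph(feats, adj)
vocab_proj = rng.normal(size=(10, 8))               # 10-token toy vocabulary
seq = decode_sequence(embs, vocab_proj)
print(seq)
```

Real systems replace the encoder with a graph neural network or graph transformer and the decoder with an attention-based RNN or transformer, but the two-stage structure (encode the graph, then generate the sequence) is the same.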
Latest papers with no code
Heterogeneous Graph Transformer for Graph-to-Sequence Learning
Graph-to-sequence (Graph2Seq) learning aims to transduce graph-structured representations into word sequences for text generation.
Learning to Encode Evolutionary Knowledge for Automatic Commenting Long Novels
Static knowledge graphs have been incorporated extensively into sequence-to-sequence frameworks for text generation.
GraphTTS: graph-to-sequence modelling in neural text-to-speech
This paper applies the graph-to-sequence method to neural text-to-speech (GraphTTS), mapping the graph embedding of the input sequence to spectrograms.
Equivalence of Dataflow Graphs via Rewrite Rules Using a Graph-to-Sequence Neural Model
In this work, we target the problem of provably computing the equivalence of two programs represented as dataflow graphs.
AMR-To-Text Generation with Graph Transformer
Abstract meaning representation (AMR)-to-text generation is the challenging task of generating natural language texts from AMR graphs, where nodes represent concepts and edges denote relations.
DynGraph2Seq: Dynamic-Graph-to-Sequence Interpretable Learning for Health Stage Prediction in Online Health Forums
In this paper, we first formulate the transition of user activities as a dynamic graph with multi-attributed nodes, then formalize the health stage inference task as a dynamic graph-to-sequence learning problem, and propose a novel dynamic graph-to-sequence neural network architecture (DynGraph2Seq) to address these challenges.