Graph-to-Sequence
26 papers with code • 2 benchmarks • 3 datasets
Mapping a graph-structured input to an output sequence, such as generated text.
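To make the task concrete, here is a minimal, dependency-free sketch of a graph-to-sequence pipeline: a mean-aggregation message-passing encoder over an adjacency list, followed by a toy greedy decoder. All names, the 0.5/0.5 mixing rule, and the decoding scheme are illustrative assumptions, not the method of any paper listed below.

```python
# Toy graph-to-sequence sketch: message-passing encoder + greedy decoder.

def encode(node_feats, adj, num_rounds=2):
    """Return one vector per node; each round mixes in the mean of
    neighbor states (a simplified message-passing step)."""
    h = [list(f) for f in node_feats]
    for _ in range(num_rounds):
        new_h = []
        for v, nbrs in enumerate(adj):
            if nbrs:
                agg = [sum(h[u][d] for u in nbrs) / len(nbrs)
                       for d in range(len(h[v]))]
            else:
                agg = [0.0] * len(h[v])
            # combine self state and neighbor aggregate (fixed 0.5/0.5 mix)
            new_h.append([0.5 * a + 0.5 * b for a, b in zip(h[v], agg)])
        h = new_h
    return h

def decode(node_states, vocab):
    """Toy greedy decoder: emit, per node, the vocab token whose index
    is the argmax dimension of that node's state."""
    return [vocab[max(range(len(s)), key=lambda d: s[d])]
            for s in node_states]

# Demo: 3-node path graph 0-1-2 with 2-dim node features.
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
adj = [[1], [0, 2], [1]]
states = encode(feats, adj)
tokens = decode(states, ["A", "B"])
```

Real models replace the fixed mixing rule with learned weights and the per-node argmax with an autoregressive decoder attending over node states.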
Most implemented papers
Fleet Prognosis with Physics-informed Recurrent Neural Networks
The results demonstrate that our proposed hybrid physics-informed recurrent neural network can accurately model fatigue crack growth even when the observed distribution of crack length does not match the (unobservable) fleet distribution.
Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation
We propose to split the generation process into a symbolic text-planning stage that is faithful to the input, followed by a neural generation stage that focuses only on realization.
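The plan-then-realize split described above can be sketched in miniature: a symbolic planning stage orders the input facts, and a separate realization stage only verbalizes the fixed plan. The sorting heuristic and templates here are hypothetical stand-ins, not the paper's actual system.

```python
# Toy plan-then-realize pipeline for data-to-text generation.

def plan(triples):
    """Symbolic planning: order (subject, predicate, object) facts by
    subject so related facts are adjacent; the plan stays faithful to
    the input because no content is added or dropped."""
    return sorted(triples, key=lambda t: t[0])

def realize(planned):
    """Surface realization: verbalize each planned triple with a
    simple template (a neural realizer would go here)."""
    return " ".join(f"{s} {p} {o}." for s, p, o in planned)

facts = [("Alice", "works at", "Acme"), ("Acme", "is based in", "Oslo")]
text = realize(plan(facts))
```

Because the planner is symbolic and content-preserving, faithfulness failures can only come from the realization stage, which is the separation the paper argues for.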
STG2Seq: Spatial-temporal Graph to Sequence Model for Multi-step Passenger Demand Forecasting
Multi-step passenger demand forecasting is a crucial task in on-demand vehicle sharing services.
Coherent Comments Generation for Chinese Articles with a Graph-to-Sequence Model
In this paper, we propose to generate comments with a graph-to-sequence model that models the input news as a topic interaction graph.
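A topic interaction graph of the kind mentioned above can be illustrated with a simple co-occurrence construction: topics become nodes, and topics appearing in the same sentence share a weighted edge. This construction is an assumption for illustration, not necessarily the paper's exact procedure.

```python
# Sketch of a topic interaction graph built from sentence co-occurrence.
from collections import defaultdict
from itertools import combinations

def topic_graph(sentences, topics):
    """Undirected graph over topic words; edge weight counts how many
    sentences mention both endpoints."""
    edges = defaultdict(int)
    for sent in sentences:
        present = sorted({t for t in topics if t in sent.lower().split()})
        for a, b in combinations(present, 2):
            edges[(a, b)] += 1
    return dict(edges)

news = ["the striker scored as the team won the final",
        "the coach praised the striker after the match"]
g = topic_graph(news, ["striker", "coach", "team"])
```

A graph-to-sequence model would then encode these topic nodes and their interaction edges before decoding a comment.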
Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation
Natural question generation (QG) aims to generate questions from a passage and an answer.
Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning
We focus on graph-to-sequence learning, which can be framed as transducing graph structures to sequences for text generation.
Enhancing AMR-to-Text Generation with Dual Graph Representations
Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is a challenging task due to the inherent difficulty of properly encoding the structure of a graph with labeled edges.
Graph Transformer for Graph-to-Sequence Learning
The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.
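The receptive-field point above can be made precise: after k rounds of message passing, a node's representation can only depend on nodes within k hops. A small reachability check on a path graph (illustrative, not the paper's model) shows the field growing one hop per round.

```python
# Receptive field of message passing: after k rounds, information can
# travel at most k hops.

def receptive_field(adj, node, rounds):
    """Set of nodes whose input features can influence `node` after
    `rounds` message-passing steps (the node itself is included)."""
    reach = {node}
    for _ in range(rounds):
        reach |= {u for v in reach for u in adj[v]}
    return reach

path = [[1], [0, 2], [1, 3], [2, 4], [3]]  # path graph 0-1-2-3-4
```

For node 0 on this path, one round reaches {0, 1}, two rounds reach {0, 1, 2}, and four rounds cover the whole graph, which is why distant node pairs need many GNN layers, the limitation graph-transformer approaches aim to relax with direct attention between all nodes.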