Graph-to-Sequence
26 papers with code • 2 benchmarks • 3 datasets
Mapping an input graph to an output sequence, e.g. generating text from an AMR graph.
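The task above can be sketched at a minimal level: a graph encoder mixes each node's features with its neighbours', and a decoder attends over the node vectors to emit tokens one at a time. This is a toy NumPy illustration of the general pattern, not the method of any paper listed below; all function and variable names are invented for the example.

```python
import numpy as np

def encode_graph(node_feats, adj):
    """One round of neighbour averaging (a minimal graph encoder)."""
    deg = adj.sum(axis=1, keepdims=True) + 1.0      # +1 counts the self-loop
    return (node_feats + adj @ node_feats) / deg    # mean over self + neighbours

def decode_greedy(node_vecs, query, vocab, steps):
    """Attend over node vectors, then greedily pick the nearest vocab row."""
    out = []
    for _ in range(steps):
        scores = node_vecs @ query                  # unnormalised attention scores
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                    # softmax over nodes
        ctx = weights @ node_vecs                   # context vector
        out.append(int(np.argmax(vocab @ ctx)))     # nearest vocabulary embedding
        query = ctx                                 # feed the context back as query
    return out
```

Real models replace the averaging step with learned graph convolutions or graph attention, and the greedy lookup with an LSTM or Transformer decoder.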
Latest papers
Enhancing AMR-to-Text Generation with Dual Graph Representations
Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is challenging because the structure of a graph with labeled edges is difficult to encode properly.
Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning
We focus on graph-to-sequence learning, which can be framed as transducing graph structures to sequences for text generation.
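The title's "densely connected" idea can be illustrated in DenseNet style: each graph-convolution layer takes the concatenation of all earlier layers' outputs as input. The sketch below is a hedged approximation of that connectivity pattern, not the paper's exact architecture; the function name and weight shapes are assumptions for the example.

```python
import numpy as np

def dense_gcn(node_feats, adj, weights):
    """Stack of graph-conv layers with dense (concatenative) skip connections."""
    a = adj + np.eye(adj.shape[0])          # add self-loops
    a = a / a.sum(axis=1, keepdims=True)    # row-normalise the adjacency
    outputs = [node_feats]
    for W in weights:
        x = np.concatenate(outputs, axis=1) # dense connection: all prior layers
        h = np.maximum(a @ x @ W, 0.0)      # graph convolution + ReLU
        outputs.append(h)
    return np.concatenate(outputs, axis=1)  # final node representations
```

Note the shape bookkeeping: with input dimension d0 and hidden size h, layer l's weight matrix must have d0 + (l-1)*h rows, since every preceding output is concatenated onto its input.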
Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation
Natural question generation (QG) aims to generate questions from a passage and an answer.
Coherent Comments Generation for Chinese Articles with a Graph-to-Sequence Model
In this paper, we propose to generate comments with a graph-to-sequence model that models the input news as a topic interaction graph.
STG2Seq: Spatial-temporal Graph to Sequence Model for Multi-step Passenger Demand Forecasting
Multi-step passenger demand forecasting is a crucial task in on-demand vehicle sharing services.
Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation
We propose to split the generation process into a symbolic text-planning stage that is faithful to the input, followed by a neural generation stage that focuses only on realization.
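The two-stage idea described above can be shown with a deliberately tiny stand-in: a symbolic planner that fixes an order over the input facts, followed by a realization step that turns the plan into surface text. The sort key and the template are invented for illustration; the paper's realization stage is neural, not template-based.

```python
def plan(triples):
    """Symbolic planning: choose an order over the input facts (here: sort by subject)."""
    return sorted(triples)

def realize(planned):
    """Realization: verbalise each planned fact (a template stand-in for a neural decoder)."""
    return " ".join(f"{s} {p} {o}." for s, p, o in planned)
```

Because the plan is produced symbolically from the input, faithfulness is enforced before the (potentially hallucination-prone) generation step ever runs.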
Structural Neural Encoders for AMR-to-text Generation
AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.
Fleet Prognosis with Physics-informed Recurrent Neural Networks
The results demonstrate that our proposed hybrid physics-informed recurrent neural network accurately models fatigue crack growth even when the observed distribution of crack length does not match the (unobservable) fleet distribution.
Deep Graph Convolutional Encoders for Structured Data to Text Generation
Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods.