Graph-to-Sequence
28 papers with code • 2 benchmarks • 4 datasets
Graph-to-sequence learning maps an input graph to an output sequence, e.g. generating text from an AMR graph or a SQL query.
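To make the task concrete, here is a minimal sketch of the generic graph-to-sequence pipeline that the papers below instantiate: a message-passing encoder produces per-node states, and an attentional decoder emits the output sequence. All class, parameter, and variable names are illustrative, not any particular paper's API.

```python
import torch
import torch.nn as nn

class Graph2SeqSketch(nn.Module):
    """Illustrative skeleton: a message-passing encoder yields one
    vector per node; a GRU decoder attends over those node states
    while emitting output tokens."""

    def __init__(self, node_dim, hidden, vocab):
        super().__init__()
        self.node_proj = nn.Linear(node_dim, hidden)
        self.attn = nn.Linear(hidden, hidden)
        self.decoder = nn.GRUCell(hidden, hidden)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, node_feats, adj, steps):
        # Encoder: one round of mean-neighbour message passing.
        h = torch.tanh(self.node_proj(node_feats))              # (N, H)
        h = h + adj @ h / adj.sum(-1, keepdim=True).clamp(min=1)
        # Decoder: attend over node states at every output step.
        s = h.mean(dim=0)                                       # initial state
        logits = []
        for _ in range(steps):
            scores = (self.attn(h) @ s).softmax(dim=0)          # (N,)
            context = (scores.unsqueeze(-1) * h).sum(dim=0)     # (H,)
            s = self.decoder(context.unsqueeze(0), s.unsqueeze(0)).squeeze(0)
            logits.append(self.out(s))
        return torch.stack(logits)                              # (steps, vocab)
```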
Libraries
Use these libraries to find Graph-to-Sequence models and implementations.

Most implemented papers
Neural AMR: Sequence-to-Sequence Models for Parsing and Generation
Sequence-to-sequence models have shown strong performance across a broad range of applications.
Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks
Our method first generates the node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy to incorporate edge direction information in the node embeddings.
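As a rough illustration of the direction-aware aggregation idea described above (not the authors' actual implementation), the sketch below pools incoming and outgoing neighbourhoods separately before fusing them; all names are hypothetical.

```python
import torch
import torch.nn as nn

class BiDirAggregation(nn.Module):
    """Sketch in the spirit of Graph2Seq: forward (outgoing) and
    backward (incoming) neighbourhoods are aggregated separately,
    then fused, so edge direction survives in the node embeddings."""

    def __init__(self, dim):
        super().__init__()
        self.fwd = nn.Linear(dim, dim)
        self.bwd = nn.Linear(dim, dim)
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, h, adj):
        # adj[i, j] = 1 for a directed edge i -> j.
        deg_out = adj.sum(1, keepdim=True).clamp(min=1)
        deg_in = adj.t().sum(1, keepdim=True).clamp(min=1)
        h_fwd = torch.relu(self.fwd(adj @ h / deg_out))      # successors
        h_bwd = torch.relu(self.bwd(adj.t() @ h / deg_in))   # predecessors
        return torch.relu(self.fuse(torch.cat([h_fwd, h_bwd], dim=-1)))
```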
Graph-to-Sequence Learning using Gated Graph Neural Networks
Many NLP applications can be framed as a graph-to-sequence learning problem.
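The gated encoder this paper builds on updates node states with GRU-style gates over several propagation steps. A minimal sketch of one such gated propagation loop, assuming a dense adjacency matrix and using illustrative names:

```python
import torch
import torch.nn as nn

class GGNNEncoder(nn.Module):
    """Sketch of Gated Graph Neural Network propagation: neighbour
    messages feed a GRU-style gated update of each node state."""

    def __init__(self, dim, n_steps=4):
        super().__init__()
        self.message = nn.Linear(dim, dim)
        self.gru = nn.GRUCell(dim, dim)
        self.n_steps = n_steps

    def forward(self, h, adj):
        for _ in range(self.n_steps):
            m = adj @ self.message(h)   # aggregate neighbour messages
            h = self.gru(m, h)          # gated node-state update
        return h
```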
Deep Graph Convolutional Encoders for Structured Data to Text Generation
Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods.
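For contrast with the gated variant above, a graph convolutional encoder layer in the Kipf-and-Welling style, of the kind this line of work stacks over the input graph, might look as follows. This is a generic sketch, not the paper's code.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """Sketch of one graph convolutional layer: symmetrically
    normalised adjacency with self-loops, then a linear map."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, h, adj):
        a = adj + torch.eye(adj.size(0))      # add self-loops
        d = a.sum(1).clamp(min=1).rsqrt()     # D^{-1/2}
        a_norm = d.unsqueeze(1) * a * d.unsqueeze(0)
        return torch.relu(self.lin(a_norm @ h))
```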
Structural Neural Encoders for AMR-to-text Generation
AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.
A Graph-to-Sequence Model for AMR-to-Text Generation
The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.
Exploiting Rich Syntactic Information for Semantic Parsing with Graph-to-Sequence Model
Existing neural semantic parsers mainly utilize a sequence encoder, i.e., a sequential LSTM, to extract word order features while neglecting other valuable syntactic information such as dependency graphs or constituency trees.
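A small, hypothetical helper illustrates how dependency-parse edges can be turned into the adjacency matrix a graph encoder like the ones sketched above consumes:

```python
import torch

def dependency_adjacency(n_tokens, dep_edges):
    """Hypothetical helper: convert (head_index, dependent_index)
    dependency edges into a dense adjacency matrix."""
    adj = torch.zeros(n_tokens, n_tokens)
    for head, dep in dep_edges:
        adj[head, dep] = 1.0
    return adj

# "she eats fish": eats -> she (nsubj), eats -> fish (obj)
adj = dependency_adjacency(3, [(1, 0), (1, 2)])
```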
SQL-to-Text Generation with Graph-to-Sequence Model
Previous work approaches the SQL-to-text generation task using vanilla Seq2Seq models, which may not fully capture the inherent graph-structured information in a SQL query.
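Purely as an illustration of the intuition (not any particular paper's preprocessing), a SQL query can be laid out as nodes and edges rather than a flat token string:

```python
def sql_query_graph():
    """Hypothetical sketch: keywords, columns, tables, and conditions
    become nodes; syntactic relations become directed edges."""
    # SELECT name FROM employee WHERE salary > 50000
    nodes = ["SELECT", "name", "FROM", "employee", "WHERE", "salary > 50000"]
    edges = [
        ("SELECT", "name"),           # projected column
        ("SELECT", "FROM"),           # clause ordering
        ("FROM", "employee"),         # source table
        ("FROM", "WHERE"),            # clause ordering
        ("WHERE", "salary > 50000"),  # filter condition
    ]
    return nodes, edges
```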