Graph-to-Sequence

10 papers with code · Natural Language Processing

Latest papers without code

Equivalence of Dataflow Graphs via Rewrite Rules Using a Graph-to-Sequence Neural Model

17 Feb 2020

In this work, we target the problem of provably computing the equivalence between two programs represented as dataflow graphs.

GRAPH-TO-SEQUENCE
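
As a rough illustration of the rewrite-rule side of this setup (not the paper's neural model or its actual rule set), the Python sketch below normalizes two tiny expression dataflow graphs with a few hypothetical algebraic rewrite rules and declares them equivalent when their normal forms match.

```python
# Minimal sketch: deciding equivalence of tiny expression dataflow graphs
# by rewriting both to a normal form and comparing. The representation and
# rules here are hypothetical illustrations, not the paper's.

def normalize(node):
    """Rewrite a dataflow node (op, *operands) bottom-up into a normal form."""
    if not isinstance(node, tuple):          # leaf: variable name or constant
        return node
    op, *args = node
    args = [normalize(a) for a in args]      # normalize operands first

    if op == "add" and 0 in args:            # rule: x + 0 -> x
        args = [a for a in args if a != 0] or [0]
        if len(args) == 1:
            return args[0]
    if op == "mul" and 1 in args:            # rule: x * 1 -> x
        args = [a for a in args if a != 1] or [1]
        if len(args) == 1:
            return args[0]
    if op in ("add", "mul"):                 # canonical operand order for commutative ops
        args = sorted(args, key=repr)
    return (op, *args)

def equivalent(g1, g2):
    """Judge two dataflow graphs equivalent if their normal forms coincide."""
    return normalize(g1) == normalize(g2)

if __name__ == "__main__":
    a = ("add", ("mul", "x", 1), 0)          # (x * 1) + 0
    b = "x"
    print(equivalent(a, b))                  # True
```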

Enhancing AMR-to-Text Generation with Dual Graph Representations

IJCNLP 2019

Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is a challenging task due to the inherent difficulty of properly encoding the structure of a graph with labeled edges.

GRAPH-TO-SEQUENCE TEXT GENERATION
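
The dual-representation idea can be pictured with plain data structures: encode the same labeled graph once along its edges (top-down) and once against them (bottom-up). The sketch below only illustrates that general idea; the helper name, example triples, and "-of" label convention are assumptions, not the paper's code.

```python
# Sketch: two complementary views of a labeled graph (top-down vs. bottom-up),
# in the spirit of dual graph representations. Illustrative data and names.
from collections import defaultdict

def dual_views(edges):
    """edges: iterable of (head, label, dependent) triples from an AMR-like graph."""
    top_down, bottom_up = defaultdict(list), defaultdict(list)
    for head, label, dep in edges:
        top_down[head].append((label, dep))           # follow edges as written
        bottom_up[dep].append((label + "-of", head))  # reversed, label inverted
    return dict(top_down), dict(bottom_up)

amr_edges = [
    ("describe-01", "ARG0", "person"),
    ("describe-01", "ARG1", "mission"),
    ("person", "name", "Ryan"),
]
td, bu = dual_views(amr_edges)
print(td["describe-01"])   # [('ARG0', 'person'), ('ARG1', 'mission')]
print(bu["person"])        # [('ARG0-of', 'describe-01')]
```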

Natural Question Generation with Reinforcement Learning Based Graph-to-Sequence Model

19 Oct 2019

Natural question generation (QG) aims to generate questions from a passage and an answer.

GRAPH-TO-SEQUENCE QUESTION GENERATION

DynGraph2Seq: Dynamic-Graph-to-Sequence Interpretable Learning for Health Stage Prediction in Online Health Forums

22 Aug 2019

In this paper, we first formulate the transition of user activities as a dynamic graph with multi-attributed nodes, then formalize the health stage inference task as a dynamic graph-to-sequence learning problem, and propose a novel dynamic graph-to-sequence neural network architecture (DynGraph2Seq) to address these challenges.

GRAPH-TO-SEQUENCE
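
That formulation can be made concrete with simple containers: a dynamic graph is a time-ordered list of graph snapshots whose nodes carry several attributes, and the target is a sequence of health stages. The field names and toy values below are hypothetical, not DynGraph2Seq's.

```python
# Sketch: the dynamic-graph-to-sequence formulation as plain data structures.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class GraphSnapshot:
    # node id -> attribute dict (e.g. post count, sentiment score)
    nodes: Dict[str, Dict[str, float]]
    # undirected interaction edges between node ids
    edges: List[Tuple[str, str]]

@dataclass
class DynGraphExample:
    snapshots: List[GraphSnapshot]   # the dynamic graph: one snapshot per time step
    stages: List[str]                # target sequence of health stages

example = DynGraphExample(
    snapshots=[
        GraphSnapshot(nodes={"u1": {"posts": 3.0}, "forum_A": {"posts": 0.0}},
                      edges=[("u1", "forum_A")]),
        GraphSnapshot(nodes={"u1": {"posts": 5.0}, "forum_B": {"posts": 0.0}},
                      edges=[("u1", "forum_B")]),
    ],
    stages=["diagnosis", "treatment"],
)
print(len(example.snapshots), example.stages)
```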

Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning

TACL 2019

We focus on graph-to-sequence learning, which can be framed as transducing graph structures to sequences for text generation.

GRAPH-TO-SEQUENCE MACHINE TRANSLATION TEXT GENERATION
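
A minimal PyTorch sketch of that transduction, under toy assumptions: a small graph encoder whose layers are densely connected (each layer sees the concatenation of all earlier outputs) feeds a GRU decoder over the target vocabulary. The dimensions, plain adjacency-matrix propagation, and mean pooling are simplifications, not the TACL 2019 architecture.

```python
import torch
import torch.nn as nn

class DenseGraphEncoder(nn.Module):
    def __init__(self, in_dim, hid_dim, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        dim = in_dim
        for _ in range(num_layers):
            self.layers.append(nn.Linear(dim, hid_dim))
            dim += hid_dim                       # dense connections: inputs grow each layer

    def forward(self, x, adj):
        # x: (num_nodes, in_dim), adj: (num_nodes, num_nodes) row-normalized
        feats = [x]
        for layer in self.layers:
            h = torch.relu(layer(adj @ torch.cat(feats, dim=-1)))
            feats.append(h)                      # keep every layer's output
        return torch.cat(feats[1:], dim=-1)      # node states from all layers

class Graph2Seq(nn.Module):
    def __init__(self, in_dim, hid_dim, vocab_size, num_layers=3):
        super().__init__()
        self.encoder = DenseGraphEncoder(in_dim, hid_dim, num_layers)
        enc_dim = hid_dim * num_layers
        self.embed = nn.Embedding(vocab_size, hid_dim)
        self.decoder = nn.GRU(hid_dim, enc_dim, batch_first=True)
        self.out = nn.Linear(enc_dim, vocab_size)

    def forward(self, x, adj, target_tokens):
        node_states = self.encoder(x, adj)             # (num_nodes, enc_dim)
        init = node_states.mean(dim=0, keepdim=True)   # pooled graph summary
        emb = self.embed(target_tokens).unsqueeze(0)   # (1, seq_len, hid_dim)
        dec_out, _ = self.decoder(emb, init.unsqueeze(0))
        return self.out(dec_out)                       # (1, seq_len, vocab_size)

# Toy usage: 4 nodes with 8-dim features, vocabulary of 20 tokens.
x = torch.randn(4, 8)
adj = torch.full((4, 4), 0.25)
model = Graph2Seq(in_dim=8, hid_dim=16, vocab_size=20)
logits = model(x, adj, torch.tensor([1, 2, 3]))
print(logits.shape)   # torch.Size([1, 3, 20])
```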

Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation

14 Aug 2019

Natural question generation (QG) aims to generate questions from a passage and an answer.

GRAPH-TO-SEQUENCE QUESTION GENERATION

Coherent Comments Generation for Chinese Articles with a Graph-to-Sequence Model

ACL 2019

In this paper, we propose to generate comments with a graph-to-sequence model that models the input news as a topic interaction graph.

GRAPH-TO-SEQUENCE
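
The input side of such a model can be sketched as a toy topic interaction graph: nodes are topics and edge weights count how many sentences mention both. The keyword-matching construction below is a stand-in assumption; the paper's graph building is more involved.

```python
# Sketch: a toy topic interaction graph for an article, the structure a
# graph-to-sequence commenter would encode. Topics and sentences are made up.
from itertools import combinations
from collections import Counter

def topic_interaction_graph(sentences, topics):
    """Nodes are topics; an edge's weight counts sentences mentioning both topics."""
    edge_weights = Counter()
    for sent in sentences:
        present = sorted({t for t in topics if t in sent})
        for a, b in combinations(present, 2):
            edge_weights[(a, b)] += 1
    return edge_weights

sentences = [
    "the team unveiled its new phone at the conference",
    "analysts expect the phone to lift the team's sales",
    "the conference also covered sales forecasts",
]
topics = ["team", "phone", "conference", "sales"]
print(topic_interaction_graph(sentences, topics))
# e.g. ('phone', 'team') gets weight 2, ('conference', 'sales') weight 1, ...
```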

Coherent Comment Generation for Chinese Articles with a Graph-to-Sequence Model

4 Jun 2019

In this paper, we propose to generate comments with a graph-to-sequence model that models the input news as a topic interaction graph.

GRAPH-TO-SEQUENCE

Structural Neural Encoders for AMR-to-text Generation

NAACL 2019

AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.

GRAPH-TO-SEQUENCE TEXT GENERATION