
AMR-to-Text Generation

5 papers with code


Greatest papers with code

A Graph-to-Sequence Model for AMR-to-Text Generation

ACL 2018 freesunshine0316/neural-graph-to-seq-mp

The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.

SOTA for Graph-to-Sequence on LDC2015E86 (using extra training data)

AMR-TO-TEXT GENERATION · GRAPH-TO-SEQUENCE · TEXT GENERATION
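As an illustration of the task input (not taken from the paper above), an AMR graph is conventionally written in PENMAN notation, and the goal is to produce a fluent sentence with the same meaning. The sketch below uses the canonical "The boy wants to go." example and shows the naive depth-first linearization that seq2seq baselines consume when the graph is flattened:

```python
# Illustrative only: a canonical AMR graph in PENMAN notation.
# This graph encodes the meaning of "The boy wants to go."
amr = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
"""

# Naive linearization: pad parentheses with spaces, then split into a
# depth-first token stream, as a flat seq2seq baseline would see it.
tokens = amr.replace("(", " ( ").replace(")", " ) ").split()
print(tokens[:6])  # ['(', 'w', '/', 'want-01', ':ARG0', '(']
```

Note that the re-entrant variable `b` appears twice in the graph but a flat token stream loses that link, which is exactly the structural information graph-to-sequence models try to preserve.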

Graph Transformer for Graph-to-Sequence Learning

18 Nov 2019 jcyk/gtos

The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.

AMR-TO-TEXT GENERATION · GRAPH REPRESENTATION LEARNING · GRAPH-TO-SEQUENCE · MACHINE TRANSLATION · TEXT GENERATION
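The GNN encoding the abstract refers to can be sketched minimally: each round of message passing widens a node's receptive field by one hop, which is how such encoders expose graph structure to the decoder. The toy graph, weights, and update rule below are illustrative assumptions, not the paper's architecture:

```python
# Minimal sketch (not the paper's model): one round of neighborhood
# message passing over a tiny AMR-like graph with three concept nodes.
import numpy as np

nodes = ["want-01", "boy", "go-02"]
edges = [(0, 1), (0, 2), (2, 1)]   # (head, dependent) index pairs
h = np.eye(3)                      # toy one-hot node states
W = np.full((3, 3), 0.5)           # shared toy message weight matrix

# Each node sums transformed neighbor states (edges treated as symmetric),
# then applies a nonlinearity on top of a residual connection.
msg = np.zeros_like(h)
for u, v in edges:
    msg[u] += h[v] @ W
    msg[v] += h[u] @ W
h_new = np.tanh(h + msg)
print(h_new.shape)  # (3, 3)
```

Stacking k such rounds lets information flow across paths of length k, whereas the Graph Transformer paper above instead relates all node pairs directly via attention.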

Structural Neural Encoders for AMR-to-text Generation

NAACL 2019 mdtux89/OpenNMT-py

AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.

AMR-TO-TEXT GENERATION · GRAPH-TO-SEQUENCE · TEXT GENERATION

Enhancing AMR-to-Text Generation with Dual Graph Representations

IJCNLP 2019 UKPLab/emnlp2019-dualgraph

Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is a challenging task due to the inherent difficulty of properly encoding the structure of a graph with labeled edges.

AMR-TO-TEXT GENERATION · DATA-TO-TEXT GENERATION · GRAPH-TO-SEQUENCE
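One common way to handle the labeled edges mentioned above is the Levi graph transform, which promotes each labeled edge to an intermediate node so that a standard GNN needs no separate edge features. The sketch below is an assumption for illustration, not the dual-graph method of the paper above:

```python
# Minimal sketch (assumed technique, not the paper's method): Levi graph
# transform turning (source, label, target) triples into an unlabeled
# edge list over an extended node set with one node per edge instance.
def levi_transform(triples):
    """Return (nodes, edges) where every labeled edge src -label-> tgt
    becomes two unlabeled edges src -> label_node -> tgt."""
    nodes, edges = set(), []
    for i, (src, label, tgt) in enumerate(triples):
        edge_node = f"{label}#{i}"      # unique node per edge instance
        nodes.update({src, edge_node, tgt})
        edges.append((src, edge_node))
        edges.append((edge_node, tgt))
    return nodes, edges

nodes, edges = levi_transform([("want-01", ":ARG0", "boy"),
                               ("want-01", ":ARG1", "go-02")])
print(len(nodes), len(edges))  # 5 4
```

After this transform, edge labels participate in message passing exactly like concept nodes, at the cost of longer paths between the original nodes.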