The problem of AMR-to-text generation is to recover a text that conveys the same meaning as an input AMR graph.
SOTA for Graph-to-Sequence on LDC2015E86 (using extra training data)
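To make the task input concrete, the following sketch pairs a well-known example sentence with its AMR written in PENMAN notation (the example is from the AMR guidelines, not from any of the listed papers); a generator should recover the sentence from the graph:

```python
# Illustrative only: the AMR graph for "The boy wants to go", in PENMAN
# notation. Note the reentrancy: variable b (boy) is both the wanter
# (:ARG0 of want-01) and the goer (:ARG0 of go-01).
amr = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-01
            :ARG0 b))
"""
target_text = "The boy wants to go."
```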
The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.
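The receptive-field idea can be sketched as follows. This is a minimal illustration of neighborhood aggregation, not any listed paper's exact model: stacking k aggregation rounds lets each node's representation depend on its k-hop neighborhood.

```python
# Minimal sketch (illustrative, not a specific paper's model): one round of
# message passing on a small AMR-style graph with scalar node features.
# Each additional round widens every node's receptive field by one hop.
from collections import defaultdict

def aggregate(features, edges):
    """One message-passing step: each node averages itself with its neighbors."""
    neighbors = defaultdict(list)
    for src, dst in edges:          # treat edges as undirected for the sketch
        neighbors[src].append(dst)
        neighbors[dst].append(src)
    updated = {}
    for node, feat in features.items():
        msgs = [features[n] for n in neighbors[node]] + [feat]
        updated[node] = sum(msgs) / len(msgs)
    return updated

# AMR-like graph for "The boy wants to go."
feats = {"want-01": 1.0, "boy": 0.0, "go-01": 0.5}
edges = [("want-01", "boy"), ("want-01", "go-01"), ("go-01", "boy")]
layer1 = aggregate(feats, edges)
layer2 = aggregate(layer1, edges)   # after two rounds, information has flowed 2 hops
```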
AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.
#2 best model for Graph-to-Sequence on LDC2015E86
Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is a challenging task because of the inherent difficulty of properly encoding the structure of a graph with labeled edges.
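One common way to handle labeled edges, used in some graph-to-sequence work, is a Levi-graph transformation: each labeled edge becomes an extra node, so an encoder for unlabeled graphs can still see the labels. The sketch below (function name and node-naming scheme are illustrative assumptions) shows the idea:

```python
# Sketch of a Levi-graph transformation: each labeled edge
# (src --label--> dst) is replaced by an edge node, turning a labeled
# graph into an unlabeled one whose nodes carry all the information.
def to_levi_graph(labeled_edges):
    nodes, edges = set(), []
    for i, (src, label, dst) in enumerate(labeled_edges):
        edge_node = f"{label}#{i}"       # one unique node per labeled edge
        nodes.update([src, edge_node, dst])
        edges.append((src, edge_node))
        edges.append((edge_node, dst))
    return nodes, edges

# AMR fragment: (w / want-01 :ARG0 (b / boy))
nodes, edges = to_levi_graph([("want-01", ":ARG0", "boy")])
```

A side effect of this transformation is that edge labels get their own learned embeddings for free, at the cost of longer paths between the original nodes.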
#2 best model for Data-to-Text Generation on LDC2017T10