AMR-to-Text Generation

15 papers with code • 5 benchmarks • 5 datasets

Abstract Meaning Representation (AMR) is a directed graph of labeled concepts and relations that captures sentence semantics. Its concepts encode propositional meaning and abstract away from surface lexical properties. AMR is tree-like in structure: it has a single root node and few nodes with multiple parents (reentrancies). The goal of AMR-to-Text Generation is to recover the original sentence realization from an AMR graph. The task can be seen as the reverse of the structured prediction performed in AMR parsing.

Source: AMR-to-Text Generation with Cache Transition Systems
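
The rooted, reentrant structure described above is easy to inspect programmatically. Below is a minimal sketch using the penman library (pip install penman); the toy graph, encoding "The boy wants to go.", is our own illustration, not from any of the papers listed here.

```python
import penman

# Toy AMR for "The boy wants to go." Variable b (boy) is re-entrant:
# it is the :ARG0 of both want-01 and go-02.
g = penman.decode("(w / want-01 :ARG0 (b / boy) :ARG1 (g2 / go-02 :ARG0 b))")

print(g.top)             # 'w' -- the single root node
print(g.instances())     # concept triples, e.g. ('w', ':instance', 'want-01')
print(g.edges())         # labeled relations between variables
print(g.reentrancies())  # nodes with multiple parents, here only 'b'
```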

Graph Pre-training for AMR Parsing and Generation

goodbai-nlp/amrbart ACL 2022

To our knowledge, we are the first to consider pre-training on semantic graphs.

★ 89 · 15 Mar 2022
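
Pre-training on graphs requires serializing them into token sequences first. A rough sketch of the usual recipe with penman follows; the variable-stripping step and the remark about pointer tokens are simplified assumptions, not AMRBART's exact scheme.

```python
import re
import penman

g = penman.decode("(w / want-01 :ARG0 (b / boy) :ARG1 (g2 / go-02 :ARG0 b))")

# Render the graph on one line, then drop the "variable /" prefixes so
# the sequence model sees concepts rather than arbitrary variable names.
line = penman.encode(g, indent=None)
linearized = re.sub(r"\b\w+ / ", "", line)
print(linearized)  # (want-01 :ARG0 (boy) :ARG1 (go-02 :ARG0 b))
# Re-entrant references (the trailing "b") are typically replaced by
# special pointer tokens in real systems before tokenization.
```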

Smelting Gold and Silver for Improved Multilingual AMR-to-Text Generation

ukplab/m-amr2text EMNLP 2021

Recent work on multilingual AMR-to-text generation has exclusively focused on data augmentation strategies that utilize silver (automatically parsed) AMR data.

★ 6 · 08 Sep 2021

One SPRING to Rule Them Both: Symmetric AMR Semantic Parsing and Generation without a Complex Pipeline

SapienzaNLP/spring AAAI 2021

In Text-to-AMR parsing, current state-of-the-art semantic parsers use cumbersome pipelines integrating several different modules or components, and exploit graph recategorization, i.e., a set of content-specific heuristics developed on the basis of the training set.

★ 121 · 18 May 2021

Structural Adapters in Pretrained Language Models for AMR-to-text Generation

ukplab/structadapt EMNLP 2021

Pretrained language models (PLMs) have recently advanced graph-to-text generation, where the input graph is linearized into a sequence and fed into the PLM to obtain its representation.

★ 29 · 16 Mar 2021
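
The adapter idea is compact enough to sketch. Here is a minimal, hypothetical PyTorch version: a bottleneck adapter whose hidden states are mixed over the graph's adjacency matrix before the residual connection. The paper's actual adapters use relational graph convolutions; this is a simplification.

```python
import torch
import torch.nn as nn

class StructuralAdapter(nn.Module):
    """Bottleneck adapter with one graph-convolution step (sketch)."""
    def __init__(self, d_model: int, d_bottleneck: int):
        super().__init__()
        self.down = nn.Linear(d_model, d_bottleneck)
        self.up = nn.Linear(d_bottleneck, d_model)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, d_model); adj: (num_nodes, num_nodes), row-normalized.
        z = torch.relu(adj @ self.down(h))  # propagate along graph edges
        return h + self.up(z)               # residual keeps the PLM signal

h = torch.randn(5, 768)                  # node states from a PLM layer
adj = torch.eye(5)                       # toy adjacency (self-loops only)
print(StructuralAdapter(768, 64)(h, adj).shape)  # torch.Size([5, 768])
```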

Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text Generation

yanzhang92/LDGCNs EMNLP 2020

With the help of these strategies, we are able to train a model with fewer parameters while maintaining model capacity.

★ 9 · 09 Oct 2020
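
One of those strategies, group graph convolution, can be sketched as follows. This is a hypothetical simplification: channels are split into groups, each with its own small weight matrix, cutting parameters roughly by the group count.

```python
import torch
import torch.nn as nn

class GroupGraphConv(nn.Module):
    """Graph convolution over channel groups (parameter-saving sketch)."""
    def __init__(self, d_model: int, groups: int):
        super().__init__()
        assert d_model % groups == 0
        self.groups = groups
        d = d_model // groups
        # Several small (d/g x d/g) matrices instead of one (d x d) matrix:
        # roughly 1/groups of the parameters of a full graph convolution.
        self.weights = nn.ModuleList(nn.Linear(d, d) for _ in range(groups))

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, d_model); adj: (num_nodes, num_nodes), normalized.
        chunks = h.chunk(self.groups, dim=-1)
        out = [w(adj @ c) for w, c in zip(self.weights, chunks)]
        return torch.relu(torch.cat(out, dim=-1))

h, adj = torch.randn(4, 256), torch.eye(4)
print(GroupGraphConv(256, 4)(h, adj).shape)  # torch.Size([4, 256])
```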

Online Back-Parsing for AMR-to-Text Generation

muyeby/AMR-Backparsing EMNLP 2020

AMR-to-text generation aims to recover a text containing the same meaning as an input AMR graph.

★ 6 · 09 Oct 2020

Investigating Pretrained Language Models for Graph-to-Text Generation

bjascob/amrlib EMNLP (NLP4ConvAI) 2021

We show that the PLMs BART and T5 achieve new state-of-the-art results and that task-adaptive pretraining strategies improve their performance even further.

★ 210 · 16 Jul 2020
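
amrlib wraps these fine-tuned models behind a small API. The graph-to-sentence call looks like this, assuming a generation model has been downloaded and installed per the amrlib documentation:

```python
import amrlib

# Load an installed graph-to-sequence (AMR-to-text) model.
gtos = amrlib.load_gtos_model()

graphs = ["(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"]
sents, _ = gtos.generate(graphs)
print(sents[0])  # e.g. "The boy wants to go."
```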

GPT-too: A language-model-first approach for AMR-to-text generation

IBM/GPT-too-AMR2text ACL 2020

Abstract Meaning Representations (AMRs) are broad-coverage sentence-level semantic graphs.

★ 38 · 18 May 2020
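
The language-model-first recipe conditions a causal LM on the serialized graph and lets it continue with the sentence. A minimal sketch with Hugging Face transformers is below; the separator string is an assumption rather than the paper's exact format, and an off-the-shelf gpt2 checkpoint would need the paper's fine-tuning before the continuation is a sensible realization.

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Condition on the PENMAN graph; the model continues with the sentence.
amr = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"
prompt = amr + " SENTENCE: "  # assumed separator, not the paper's exact one
ids = tok(prompt, return_tensors="pt").input_ids
out = model.generate(ids, max_new_tokens=15, do_sample=False,
                     pad_token_id=tok.eos_token_id)
print(tok.decode(out[0][ids.shape[1]:]))
```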

Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity

amazon-research/datatuner COLING 2020

Our generated text has significantly better semantic fidelity than the state of the art across all four datasets.

★ 92 · 08 Apr 2020

Graph Transformer for Graph-to-Sequence Learning

jcyk/gtos

The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.

★ 191 · 18 Nov 2019
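
The alternative explored here is to expose pairwise relations to attention directly. A minimal sketch follows; the paper encodes full shortest-path relation sequences between node pairs, so a single learned bias per relation id is a deliberate simplification.

```python
import torch
import torch.nn as nn

class RelationAwareAttention(nn.Module):
    """Self-attention whose scores carry a learned pairwise relation bias."""
    def __init__(self, d_model: int, n_relations: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.rel_bias = nn.Embedding(n_relations, 1)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        # x: (n, d_model); rel_ids: (n, n) relation id for each node pair.
        scores = (self.q(x) @ self.k(x).T) * self.scale
        scores = scores + self.rel_bias(rel_ids).squeeze(-1)
        return torch.softmax(scores, dim=-1) @ self.v(x)

x = torch.randn(4, 32)
rel_ids = torch.randint(0, 8, (4, 4))
print(RelationAwareAttention(32, 8)(x, rel_ids).shape)  # torch.Size([4, 32])
```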