BiBL: AMR Parsing and Generation with Bidirectional Bayesian Learning

COLING 2022  ·  Ziming Cheng, Zuchao Li, Hai Zhao

Abstract Meaning Representation (AMR) offers a unified semantic representation for natural language sentences. Transformation between AMR and text thus yields two transition tasks in opposite directions: Text-to-AMR parsing and AMR-to-Text generation. Despite the duality of the two tasks, existing AMR studies improve only one side, and their gains are largely attributable to large amounts of extra training data or to complex structural modifications that harm inference speed. Instead, we propose data-efficient Bidirectional Bayesian learning (BiBL) to facilitate bidirectional information transfer by adopting a single-stage multitasking strategy, so that the resulting model also enjoys much lighter training. Evaluation on benchmark datasets shows that our proposed BiBL outperforms strong previous seq2seq refinements without the extra data that is indispensable in existing counterpart models. We release the code of BiBL at:
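To make the single-stage multitasking idea concrete, here is a minimal sketch (not the authors' code; the direction tags and helper name are assumptions) of how one shared seq2seq model can be fed both transition directions in a single training stage: each (text, AMR) pair is expanded into a Text-to-AMR instance and an AMR-to-Text instance, distinguished by a tag on the source side.

```python
# Hypothetical sketch of single-stage bidirectional multitasking.
# The tag tokens "<to-amr>"/"<to-text>" and the function name are
# illustrative assumptions, not the paper's actual implementation.

def make_bidirectional_batch(pairs):
    """Expand (text, amr) pairs into seq2seq instances for both directions.

    A direction tag prepended to the source sequence tells the single
    shared model which task each instance belongs to, so both tasks are
    trained jointly in one stage rather than in separate pipelines.
    """
    batch = []
    for text, amr in pairs:
        batch.append(("<to-amr> " + text, amr))   # Text-to-AMR parsing
        batch.append(("<to-text> " + amr, text))  # AMR-to-Text generation
    return batch

pairs = [("The boy wants to go.",
          "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))")]
for src, tgt in make_bidirectional_batch(pairs):
    print(src.split()[0], "->", tgt[:20])
```

Every gold pair contributes a training signal in both directions, which is one way a single model can exploit the duality of parsing and generation without extra data.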


Results from the Paper

 Ranked #1 on AMR-to-Text Generation on LDC2017T10 (using extra training data)

Task                    Dataset     Model        Metric  Value  Global Rank  Uses Extra Training Data
AMR-to-Text Generation  LDC2017T10  BiBL+Silver  BLEU    51.5   # 1          Yes
                                                 METEOR  45.2   # 1
                                                 ChrF++  77.6   # 1
AMR-to-Text Generation  LDC2017T10  BiBL         BLEU    47.0   # 4          No
                                                 METEOR  43.2   # 2
                                                 ChrF++  74.8   # 3
AMR Parsing             LDC2017T10  BiBL+Silver  Smatch  84.7   # 9          Yes
AMR Parsing             LDC2017T10  BiBL         Smatch  84.6   # 12         No
AMR Parsing             LDC2020T02  BiBL+Silver  Smatch  83.5   # 9          Yes
AMR Parsing             LDC2020T02  BiBL         Smatch  83.9   # 8          No
AMR-to-Text Generation  LDC2020T02  BiBL         BLEU    47.4   # 4          No
                                                 ChrF++  74.5   # 3
                                                 METEOR  43.4   # 2
AMR-to-Text Generation  LDC2020T02  BiBL+Silver  BLEU    50.7   # 1          Yes
                                                 ChrF++  76.7   # 1
                                                 METEOR  45.0   # 1

