Knowledge-Aware Graph-Enhanced GPT-2 for Dialogue State Tracking

EMNLP 2021  ·  Weizhe Lin, Bo-Hsiang Tseng, Bill Byrne

Dialogue State Tracking (DST) is central to multi-domain task-oriented dialogue systems, as it extracts the information conveyed in user utterances. We present a novel hybrid architecture that augments GPT-2 with representations derived from Graph Attention Networks in such a way as to allow causal, sequential prediction of slot values. The architecture captures inter-slot relationships and cross-domain dependencies that can otherwise be lost in sequential prediction. We report improvements in state tracking performance on MultiWOZ 2.0 over a strong GPT-2 baseline, and we investigate a simplified, sparse training scenario in which DST models are trained only on session-level annotations but evaluated at the turn level. We further present detailed analyses demonstrating the effectiveness of graph models in DST: the proposed graph modules capture inter-slot dependencies and improve predictions of values that are shared across multiple domains.
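To make the graph component concrete, the sketch below implements a single-head Graph Attention Network layer of the kind the abstract refers to, with one node per dialogue slot. This is a minimal illustration of the standard GAT attention mechanism, not the paper's actual module; the node features, adjacency matrix, and parameter shapes are hypothetical.

```python
import numpy as np

def gat_layer(h, adj, W, a, leaky=0.2):
    """Single-head graph attention layer (a sketch of the standard GAT
    formulation; the paper's exact module may differ).
    h:   (N, F)  node features, e.g. one node per dialogue slot
    adj: (N, N)  0/1 adjacency over slot nodes (hypothetical graph)
    W:   (F, F') shared linear projection
    a:   (2F',)  attention parameter vector
    """
    z = h @ W                              # project node features
    N = z.shape[0]
    # attention logits e_ij = LeakyReLU(a^T [z_i || z_j])
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            s = a @ np.concatenate([z[i], z[j]])
            e[i, j] = s if s > 0 else leaky * s
    e = np.where(adj > 0, e, -1e9)         # mask non-neighbors
    # row-wise softmax over neighbors
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ z                       # aggregate neighbor features
```

The resulting per-slot representations could then be fed into the GPT-2 decoder alongside token embeddings, which is the general shape of the hybrid architecture the abstract describes.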



Results (Task: Multi-domain Dialogue State Tracking · Dataset: MultiWOZ 2.0)

Model                  Metric     Value   Global Rank
L4P4K2-DSGraph         Joint Acc  54.86   #1
                       Slot Acc   97.47   #1
L4P4K2-DSVGraph        Joint Acc  54.62   #3
                       Slot Acc   97.42   #2
SimpleTOD-Reproduced   Joint Acc  51.37   #11
DSTQA-Reproduced       Joint Acc  52.24   #7