Slot-Gated Modeling for Joint Slot Filling and Intent Prediction

NAACL 2018  ·  Chih-Wen Goo, Guang Gao, Yun-Kai Hsu, Chih-Li Huo, Tsung-Chieh Chen, Keng-Wei Hsu, Yun-Nung Chen

Attention-based recurrent neural network models for joint intent detection and slot filling have achieved state-of-the-art performance, but they use independent attention weights for the two tasks. Since slots and intent are strongly related, this paper proposes a slot gate that learns the relationship between the intent and slot attention vectors in order to obtain better semantic frame results through global optimization. Experiments show that the proposed model significantly improves sentence-level semantic frame accuracy, with 4.2% and 1.9% relative improvement over the attentional model on the benchmark ATIS and Snips datasets, respectively.
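The slot gate described above combines the slot attention context with the intent attention context to produce a scalar gate per time step, which then weights the slot context before slot classification. The following is a minimal numpy sketch of that gating computation; the dimensions, variable names, and random initialization are illustrative assumptions, not the paper's actual training setup.

```python
import numpy as np

def slot_gate(c_slot, c_intent, v, W):
    """Scalar slot gate: g = sum(v * tanh(c_slot + W @ c_intent)).

    c_slot   : slot attention context vector at one time step
    c_intent : sentence-level intent attention context vector
    v, W     : trainable gate parameters (here randomly initialized
               purely for illustration)
    """
    return np.sum(v * np.tanh(c_slot + W @ c_intent))

# Hypothetical dimensions and random values for illustration only.
d = 8
rng = np.random.default_rng(0)
c_slot = rng.standard_normal(d)
c_intent = rng.standard_normal(d)
v = rng.standard_normal(d)
W = rng.standard_normal((d, d))

g = slot_gate(c_slot, c_intent, v, W)

# The gated slot features would then feed the slot label softmax,
# roughly: y_slot = softmax(W_S @ (h + g * c_slot)), where h is the
# BLSTM hidden state at that time step.
```

Because tanh is bounded in [-1, 1], the gate magnitude is bounded by the L1 norm of v; a larger |g| indicates stronger agreement between the slot and intent contexts, letting intent information modulate each slot prediction.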


Results from the Paper


Task: Intent Detection   Dataset: ATIS   Model: Slot-Gated BLSTM with Attention
  Accuracy: 94.10 (global rank #10)
  F1: 95.20 (global rank #6)

Task: Intent Detection   Dataset: SNIPS   Model: Slot-Gated BLSTM with Attention
  Intent Accuracy: 97.00 (global rank #6)
  Slot F1 Score: 88.80 (global rank #7)
