SLIM: Explicit Slot-Intent Mapping with BERT for Joint Multi-Intent Detection and Slot Filling

26 Aug 2021 · Fengyu Cai, Wanhao Zhou, Fei Mi, Boi Faltings

Utterance-level intent detection and token-level slot filling are two key tasks for natural language understanding (NLU) in task-oriented systems. Most existing approaches assume that only a single intent exists in an utterance. However, there are often multiple intents within an utterance in real-life scenarios. In this paper, we propose a multi-intent NLU framework, called SLIM, to jointly learn multi-intent detection and slot filling based on BERT. To fully exploit the existing annotation data and capture the interactions between slots and intents, SLIM introduces an explicit slot-intent classifier to learn the many-to-one mapping between slots and intents. Empirical results on three public multi-intent datasets demonstrate (1) the superior performance of SLIM compared to the current state-of-the-art for NLU with multiple intents and (2) the benefits obtained from the slot-intent classifier.
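To make the architecture concrete, below is a minimal PyTorch sketch of the three prediction heads the abstract describes: an utterance-level multi-intent head, a token-level slot-filling head, and the explicit slot-intent classifier that maps each token's slot to one of the predicted intents. This is an illustrative reconstruction, not the authors' code: the BERT encoder is replaced by a toy embedding layer so the example is self-contained, and all class and head names are assumptions.

```python
import torch
import torch.nn as nn

class SLIMSketch(nn.Module):
    """Illustrative sketch of SLIM-style heads on a shared encoder.

    The real model uses BERT; here a toy embedding stands in so the
    example runs anywhere. Head names are hypothetical.
    """

    def __init__(self, vocab_size=100, hidden=32, num_intents=4, num_slots=6):
        super().__init__()
        self.encoder = nn.Embedding(vocab_size, hidden)  # stand-in for BERT
        # Utterance-level multi-intent detection: one sigmoid score per
        # intent label, so several intents can be active at once.
        self.intent_head = nn.Linear(hidden, num_intents)
        # Token-level slot filling: a distribution over slot tags per token.
        self.slot_head = nn.Linear(hidden, num_slots)
        # Explicit slot-intent classifier: assigns each token to the intent
        # its slot belongs to, modeling the many-to-one slot-to-intent map.
        self.slot_intent_head = nn.Linear(hidden, num_intents)

    def forward(self, token_ids):
        h = self.encoder(token_ids)                  # (batch, seq, hidden)
        pooled = h.mean(dim=1)                       # crude [CLS] substitute
        intent_logits = self.intent_head(pooled)             # (batch, num_intents)
        slot_logits = self.slot_head(h)                      # (batch, seq, num_slots)
        slot_intent_logits = self.slot_intent_head(h)        # (batch, seq, num_intents)
        return intent_logits, slot_logits, slot_intent_logits

model = SLIMSketch()
ids = torch.randint(0, 100, (2, 7))  # batch of 2 utterances, 7 tokens each
intent_logits, slot_logits, slot_intent_logits = model(ids)
```

In training, the multi-intent head would take a binary cross-entropy loss per label, while the slot and slot-intent heads would take token-level cross-entropy losses; jointly optimizing all three is what lets the slot-intent mapping inform both tasks.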


Results from the Paper


| Task                   | Dataset  | Model | Metric Name | Metric Value | Global Rank |
|------------------------|----------|-------|-------------|--------------|-------------|
| Semantic Frame Parsing | MixATIS  | SLIM  | Accuracy    | 47.6         | #13         |
| Intent Detection       | MixATIS  | SLIM  | Accuracy    | 78.3         | #10         |
| Slot Filling           | MixATIS  | SLIM  | Micro F1    | 88.5         | #7          |
| Slot Filling           | MixSNIPS | SLIM  | Micro F1    | 96.5         | #3          |
| Semantic Frame Parsing | MixSNIPS | SLIM  | Accuracy    | 84.0         | #5          |
| Intent Detection       | MixSNIPS | SLIM  | Accuracy    | 97.2         | #8          |