BERT for Joint Intent Classification and Slot Filling

28 Feb 2019 · Qian Chen, Zhu Zhuo, Wen Wang

Intent classification and slot filling are two essential tasks for natural language understanding. They often suffer from small-scale human-labeled training data, resulting in poor generalization capability, especially for rare words. Recently, a new language representation model, BERT (Bidirectional Encoder Representations from Transformers), has facilitated pre-training deep bidirectional representations on large-scale unlabeled corpora and, after simple fine-tuning, has produced state-of-the-art models for a wide variety of natural language processing tasks. However, there has been little work exploring BERT for natural language understanding. In this work, we propose a joint intent classification and slot filling model based on BERT. Experimental results demonstrate that our proposed model achieves significant improvements in intent classification accuracy, slot filling F1, and sentence-level semantic frame accuracy on several public benchmark datasets, compared to attention-based recurrent neural network models and slot-gated models.
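
In the paper, the intent is predicted from BERT's pooled [CLS] representation and the slot labels from the per-token hidden states, with the two cross-entropy losses optimized jointly. Below is a minimal sketch of such a joint model, assuming the Hugging Face Transformers API; the class counts, dropout rate, and equal loss weighting are illustrative choices, not the authors' released code.

```python
# Minimal sketch (not the authors' code) of joint intent + slot fine-tuning on BERT.
import torch.nn as nn
from transformers import BertModel

class JointBert(nn.Module):
    def __init__(self, num_intents, num_slot_labels, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.dropout = nn.Dropout(0.1)
        self.intent_classifier = nn.Linear(hidden, num_intents)    # head on pooled [CLS]
        self.slot_classifier = nn.Linear(hidden, num_slot_labels)  # head on each token

    def forward(self, input_ids, attention_mask, intent_labels=None, slot_labels=None):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        intent_logits = self.intent_classifier(self.dropout(outputs.pooler_output))
        slot_logits = self.slot_classifier(self.dropout(outputs.last_hidden_state))

        loss = None
        if intent_labels is not None and slot_labels is not None:
            ce = nn.CrossEntropyLoss(ignore_index=-100)  # -100 masks padding/subword positions
            intent_loss = ce(intent_logits, intent_labels)
            slot_loss = ce(slot_logits.view(-1, slot_logits.size(-1)), slot_labels.view(-1))
            loss = intent_loss + slot_loss  # joint objective: sum of the two losses
        return loss, intent_logits, slot_logits
```

The "Joint BERT + CRF" variant in the results below replaces the independent per-token softmax over slot labels with a CRF layer on top of the slot logits.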

Datasets

ATIS

Results from the Paper


Task              Dataset  Model             Metric    Value   Global Rank
Slot Filling      ATIS     Joint BERT        F1        0.961   # 3
Intent Detection  ATIS     Joint BERT + CRF  Accuracy  97.9    # 6
Intent Detection  ATIS     Joint BERT        Accuracy  97.5    # 9
Slot Filling      ATIS     Joint BERT + CRF  F1        0.96    # 6
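
The paper evaluates with intent classification accuracy, span-level slot filling F1, and (not shown in this table) sentence-level semantic frame accuracy, which credits an utterance only when the intent and every slot label are correct. A small evaluation sketch follows, assuming seqeval for span-level F1; the library choice and helper name are illustrative assumptions, not from the paper.

```python
# Hypothetical evaluation helper (not from the paper): intent accuracy,
# span-level slot F1 via seqeval, and sentence-level semantic frame accuracy.
from seqeval.metrics import f1_score

def evaluate(intent_preds, intent_golds, slot_preds, slot_golds):
    """intent_*: one label per utterance; slot_*: one BIO tag sequence per utterance."""
    n = len(intent_golds)
    intent_acc = sum(p == g for p, g in zip(intent_preds, intent_golds)) / n
    slot_f1 = f1_score(slot_golds, slot_preds)  # span-level F1 over BIO sequences
    # Semantic frame accuracy: an utterance counts only if the intent and
    # every slot tag are all predicted correctly.
    frame_acc = sum(
        ip == ig and sp == sg
        for ip, ig, sp, sg in zip(intent_preds, intent_golds, slot_preds, slot_golds)
    ) / n
    return {"intent_acc": intent_acc, "slot_f1": slot_f1, "frame_acc": frame_acc}
```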

Methods

BERT