CAE: A Mechanism to Diminish Class Imbalance in the SLU Slot Filling Task

Spoken Language Understanding (SLU) is a widely applied task in Natural Language Processing. Building on the success of the pre-trained BERT model, SLU is commonly addressed as joint Intent Classification and Slot Filling, with significantly improved performance. However, the class imbalance problem in SLU has not been carefully investigated, even though it is frequent in Semantic Parsing datasets. This work therefore focuses on diminishing that problem. We propose a BERT-based architecture named JointBERT Classify Anonymous Entity (JointBERT-CAE), which improves system performance on three Semantic Parsing datasets (ATIS, Snips, and Vietnamese ATIS) and on the well-known Named Entity Recognition (NER) dataset CoNLL-2003. In JointBERT-CAE, we use multi-task joint learning to split the conventional Slot Filling task into two sub-tasks: detecting anonymous entities by sequence tagging, and classifying the recognized anonymous entities. Experimental results show solid improvements of JointBERT-CAE over BERT on all datasets, as well as its broad applicability to other NLP tasks that use the sequence-tagging technique.
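
To make the two-step split concrete, below is a minimal PyTorch sketch of the idea, assuming the Hugging Face transformers library. The class name JointBertCAE, the O/B/I tag inventory, the mean-pooled span vectors, and the entity_spans argument are illustrative assumptions, not the paper's exact implementation.

import torch
import torch.nn as nn
from transformers import BertModel

class JointBertCAE(nn.Module):
    """Sketch of a joint model: intent head + anonymous-entity tagging head
    + entity classification head over detected spans (assumed design)."""

    def __init__(self, num_intents, num_entity_types, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Intent classification head on the pooled [CLS] representation.
        self.intent_head = nn.Linear(hidden, num_intents)
        # Anonymous-entity tagging head: only O/B/I, with no entity types,
        # so the tag distribution is far less imbalanced than full slot tags.
        self.tagging_head = nn.Linear(hidden, 3)  # O=0, B=1, I=2
        # Entity classification head applied to pooled span vectors.
        self.entity_head = nn.Linear(hidden, num_entity_types)

    def forward(self, input_ids, attention_mask, entity_spans):
        # entity_spans: list of (batch_index, start, end) token spans;
        # gold spans during training, decoded from tag predictions at inference.
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        tokens = out.last_hidden_state                    # (B, T, H)
        intent_logits = self.intent_head(out.pooler_output)
        tag_logits = self.tagging_head(tokens)            # per-token O/B/I scores
        # Mean-pool each anonymous-entity span, then classify its type.
        span_vecs = [tokens[b, s:e].mean(dim=0) for b, s, e in entity_spans]
        entity_logits = self.entity_head(torch.stack(span_vecs))
        return intent_logits, tag_logits, entity_logits

In this sketch, the three heads would be trained jointly with a summed cross-entropy loss over gold labels. Because the tagging head only distinguishes entity boundaries rather than types, rare slot labels no longer compete directly with the dominant O tag, which appears to be the imbalance-reduction mechanism the abstract describes.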

Task                                    Dataset    Model          Metric Name       Metric Value  Global Rank
Intent Detection                        ATIS       JointBERT-CAE  Accuracy          97.5          # 9
Slot Filling                            ATIS       JointBERT-CAE  F1                0.961         # 3
Intent Detection                        ATIS (vi)  JointBERT-CAE  Intent Accuracy   97.7          # 1
Intent Classification and Slot Filling  ATIS (vi)  JointBERT-CAE  Slot F1           95.5          # 1
Intent Classification and Slot Filling  ATIS (vi)  JointBERT-CAE  Intent Accuracy   97.7          # 1
Intent Classification and Slot Filling  ATIS (vi)  JointBERT-CAE  Exact Match (EM)  87.9          # 1
Slot Filling                            ATIS (vi)  JointBERT-CAE  Slot F1           95.5          # 1
Intent Detection                        SNIPS      JointBERT-CAE  Intent Accuracy   98.3          # 3
Intent Detection                        SNIPS      JointBERT-CAE  Slot F1 Score     97.0          # 2
Slot Filling                            SNIPS      JointBERT-CAE  F1                0.97          # 2
