Label-aware Document Representation via Hybrid Attention for Extreme Multi-Label Text Classification

24 May 2019 · Xin Huang, Boli Chen, Lin Xiao, Liping Jing

Extreme multi-label text classification (XMTC) aims at tagging a document with the most relevant labels from an extremely large-scale label set. It is a challenging problem, especially for tail labels, for which only a few training documents are available to build a classifier. This paper is motivated to better explore the semantic relationship between each document and the extreme label set by taking advantage of both document content and label correlation. Our objective is to establish an explicit label-aware representation for each document with a hybrid attention deep neural network model (LAHA). LAHA consists of three parts. The first part adopts a multi-label self-attention mechanism to detect the contribution of each word to each label. The second part exploits the label structure and document content to determine the semantic connection between words and labels in the same latent space. An adaptive fusion strategy is designed in the third part to obtain the final label-aware document representation, so that the essence of the previous two parts can be sufficiently integrated. Extensive experiments have been conducted on six benchmark datasets, comparing against state-of-the-art methods. The results show the superiority of our proposed LAHA method, especially on tail labels.
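
The abstract outlines a three-part architecture. Below is a minimal PyTorch sketch of how such a hybrid-attention model could be wired up; the BiLSTM encoder choice, layer sizes, exact attention formulations, and the sigmoid fusion gate (and all names such as LAHASketch, W1, W2, label_emb, gate) are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a LAHA-style model (illustrative only; dimensions and
# attention forms are assumptions, not the paper's exact architecture).
import torch
import torch.nn as nn

class LAHASketch(nn.Module):
    def __init__(self, vocab_size, num_labels, emb_dim=300, hidden_dim=256, attn_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        enc_dim = 2 * hidden_dim

        # Part 1: multi-label self-attention (one attention head per label).
        self.W1 = nn.Linear(enc_dim, attn_dim, bias=False)
        self.W2 = nn.Linear(attn_dim, num_labels, bias=False)

        # Part 2: label embeddings living in the same latent space as words
        # (the paper derives label structure from label correlation).
        self.label_emb = nn.Parameter(torch.randn(num_labels, enc_dim))

        # Part 3: adaptive fusion gate between the two label-aware views.
        self.gate = nn.Linear(2 * enc_dim, 1)

        # Per-label scorer over the fused label-aware representation.
        self.scorer = nn.Linear(enc_dim, 1)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        H, _ = self.encoder(self.embed(token_ids))                    # (B, T, enc_dim)

        # Part 1: contribution of every word to every label via self-attention.
        A1 = torch.softmax(self.W2(torch.tanh(self.W1(H))), dim=1)    # (B, T, L)
        M1 = A1.transpose(1, 2) @ H                                    # (B, L, enc_dim)

        # Part 2: word-label interaction attention in the shared latent space.
        A2 = torch.softmax(H @ self.label_emb.t(), dim=1)              # (B, T, L)
        M2 = A2.transpose(1, 2) @ H                                    # (B, L, enc_dim)

        # Part 3: adaptive fusion of the two label-aware representations.
        alpha = torch.sigmoid(self.gate(torch.cat([M1, M2], dim=-1)))  # (B, L, 1)
        M = alpha * M1 + (1 - alpha) * M2                              # (B, L, enc_dim)

        # One logit per label, to be trained with binary cross-entropy.
        return self.scorer(M).squeeze(-1)                              # (B, L)
```

In such a setup the per-label logits would be trained with a binary cross-entropy objective (e.g. nn.BCEWithLogitsLoss) and ranked per document at test time to produce P@k and nDCG@k scores like those reported below.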

Task: Multi-Label Text Classification   Model: LAHA
(cell format: metric value, with the global rank on the benchmark in parentheses)

Dataset        P@1          P@3          P@5          nDCG@3       nDCG@5
AAPD           84.48 (#2)   60.72 (#2)   41.19 (#2)   80.11 (#2)   83.7  (#2)
Amazon-12K     94.87 (#1)   79.16 (#1)   63.16 (#1)   89.13 (#1)   87.57 (#1)
EUR-Lex        74.95 (#2)   61.48 (#2)   50.71 (#3)   64.89 (#2)   59.28 (#3)
Kan-Shan Cup   54.38 (#1)   34.6  (#1)   25.88 (#1)   51.7  (#1)   54.65 (#1)
Wiki-30K       84.18 (#1)   73.14 (#1)   62.87 (#1)   75.64 (#1)   67.82 (#1)
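
For reference, P@k and nDCG@k above follow the definitions standard in the XMTC literature: precision over the top-k ranked labels, and DCG over the top-k normalised by the ideal DCG truncated at the number of true labels. A small NumPy sketch with made-up scores and labels:

```python
# Illustrative computation of P@k and nDCG@k as commonly defined for XMTC.
import numpy as np

def precision_at_k(scores, relevant, k):
    """Fraction of the top-k predicted labels that are relevant."""
    top_k = np.argsort(-scores)[:k]
    return np.isin(top_k, relevant).mean()

def ndcg_at_k(scores, relevant, k):
    """DCG of the top-k predictions normalised by the ideal (truncated) DCG."""
    top_k = np.argsort(-scores)[:k]
    gains = np.isin(top_k, relevant).astype(float)
    discounts = 1.0 / np.log2(np.arange(2, k + 2))
    dcg = (gains * discounts).sum()
    ideal = discounts[: min(k, len(relevant))].sum()
    return dcg / ideal if ideal > 0 else 0.0

# Example: 5 candidate labels, labels {0, 3} are relevant for this document.
scores = np.array([0.9, 0.1, 0.4, 0.8, 0.2])
relevant = np.array([0, 3])
print(precision_at_k(scores, relevant, 3))  # 2/3 ~= 0.667
print(ndcg_at_k(scores, relevant, 3))       # 1.0 (both relevant labels ranked first)
```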
