Soft-Attention Improves Skin Cancer Classification Performance

5 May 2021  ·  Soumyya Kanti Datta, Mohammad Abuzar Shaikh, Sargur N. Srihari, Mingchen Gao

In clinical applications, neural networks must focus on and highlight the most important parts of an input image. The Soft-Attention mechanism enables a neural network to achieve this goal. This paper investigates the effectiveness of Soft-Attention in deep neural architectures; its central aim is to boost the value of important features and suppress noise-inducing features. We compare the performance of VGG, ResNet, InceptionResNetv2 and DenseNet architectures with and without the Soft-Attention mechanism on skin lesion classification. When coupled with Soft-Attention, the original network outperforms the baseline [16] by 4.7%, achieving a precision of 93.7% on the HAM10000 dataset [25]. Additionally, Soft-Attention coupling improves the sensitivity score by 3.8% over the baseline [31], reaching 91.6% on the ISIC-2017 dataset [2]. The code is publicly available on GitHub.
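As a rough illustration of the idea, the sketch below shows how a soft-attention block of this kind could be wired into a CNN: a small convolution produces several attention maps, each map is softmax-normalized over its spatial positions, the maps are pooled into a single map, and the input features are re-weighted by it through a learnable scale. The module name, head count, kernel size, and residual combination are assumptions made for illustration, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftAttention(nn.Module):
    """Minimal sketch of a soft-attention block for CNN feature maps.

    Hyper-parameters and the exact normalization/combination are assumptions,
    not the paper's reference implementation.
    """

    def __init__(self, channels: int, heads: int = 16):
        super().__init__()
        self.heads = heads
        # small convolution that produces `heads` attention logit maps
        self.attn_conv = nn.Conv2d(channels, heads, kernel_size=3, padding=1)
        # learnable scale on the attended features, initialized at zero
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # softmax over spatial positions of each attention head
        logits = self.attn_conv(x).view(b, self.heads, h * w)
        maps = F.softmax(logits, dim=-1).view(b, self.heads, h, w)
        # collapse the heads into one spatial attention map
        attn = maps.sum(dim=1, keepdim=True)
        # boost salient locations while keeping a residual path to the input
        return x + self.gamma * (x * attn)


if __name__ == "__main__":
    feats = torch.randn(2, 64, 28, 28)        # e.g. an intermediate CNN feature map
    out = SoftAttention(channels=64)(feats)
    print(out.shape)                          # torch.Size([2, 64, 28, 28])
```

In a full classifier, a block like this would typically be inserted after a late convolutional stage (e.g. of InceptionResNetv2), so the re-weighted features feed the classification head.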


Results from the Paper


Task                   Dataset    Model                            Metric             Value  Global Rank
Lesion Classification  HAM10000   IRv2+Soft Attention              Accuracy           93.4   #2
Lesion Classification  HAM10000   IRv2+Soft Attention              AUC                0.984  #1
Lesion Classification  HAM10000   IRv2+Soft Attention              Average Precision  0.937  #1
Lesion Classification  ISIC 2017  IRv2+Soft Attention              Accuracy           90.4   #1
Lesion Classification  ISIC 2017  IRv2+Soft Attention              AUC                0.959  #1
Lesion Classification  ISIC 2017  ARL-CNN50 [zhang2019attention]   Accuracy           86.8   #2
Lesion Classification  ISIC 2017  ARL-CNN50 [zhang2019attention]   AUC                0.958  #2
Lesion Classification  ISIC 2017  SEnet50 [hu2018squeeze]          Accuracy           86.3   #3
Lesion Classification  ISIC 2017  RAN50 [wang2017residual]         Accuracy           86.2   #4
Lesion Classification  ISIC 2017  RAN50 [wang2017residual]         AUC                0.942  #4
Lesion Classification  ISIC 2017  ResNet50 [he2016deep]            Accuracy           84.2   #5
Lesion Classification  ISIC 2017  ResNet50 [he2016deep]            AUC                0.948  #3
