Dynamic Head: Unifying Object Detection Heads with Attentions

The complex nature of combining localization and classification in object detection has led to a flourishing development of methods. Previous works tried to improve performance in various object detection heads but failed to present a unified view. In this paper, we present a novel dynamic head framework to unify object detection heads with attentions. By coherently combining multiple self-attention mechanisms between feature levels for scale-awareness, among spatial locations for spatial-awareness, and within output channels for task-awareness, the proposed approach significantly improves the representation ability of object detection heads without any computational overhead. Further experiments demonstrate the effectiveness and efficiency of the proposed dynamic head on the COCO benchmark. With a standard ResNeXt-101-DCN backbone, we largely improve the performance over popular object detectors and achieve a new state-of-the-art at 54.0 AP. Furthermore, with the latest transformer backbone and extra data, we push the current best COCO result to a new record of 60.6 AP. The code will be released at https://github.com/microsoft/DynamicHead.
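The three attentions described above can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification, not the paper's implementation: it views the head input as a tensor of shape (levels L, spatial positions S, channels C) and applies the three attentions sequentially. In the actual DyHead, the spatial attention is a deformable convolution and the scale/task coefficients are produced by small learned subnetworks; here fixed, parameter-free stand-ins are used purely to show the data flow.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def scale_aware(F):
    # one attention weight per feature level, from global pooling over S and C
    # (stand-in for the paper's learned 1x1-conv + hard-sigmoid gate)
    w = sigmoid(F.mean(axis=(1, 2), keepdims=True))  # shape (L, 1, 1)
    return F * w

def spatial_aware(F):
    # simplified stand-in for the deformable-conv spatial attention:
    # softmax weights over spatial positions, rescaled so the mean weight is 1
    a = np.exp(F - F.max(axis=1, keepdims=True))
    w = a / a.sum(axis=1, keepdims=True)             # shape (L, S, C)
    return F * (w * F.shape[1])

def task_aware(F, alpha=(1.0, 0.0), beta=(0.0, 0.0)):
    # dynamic-ReLU-style channel activation: max of two linear pieces;
    # the defaults reduce to a plain ReLU (in the paper these are learned)
    a1, a2 = alpha
    b1, b2 = beta
    return np.maximum(a1 * F + b1, a2 * F + b2)

def dyhead_block(F):
    # the paper stacks the attentions sequentially: task(spatial(scale(F)))
    return task_aware(spatial_aware(scale_aware(F)))

F = np.random.randn(4, 16, 8)   # L=4 levels, S=16 positions, C=8 channels
out = dyhead_block(F)
print(out.shape)                # (4, 16, 8): each attention preserves shape
```

Each attention modulates one axis of the tensor and leaves its shape unchanged, which is why the paper can stack such blocks repeatedly on top of any feature pyramid.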

CVPR 2021

Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Object Detection | COCO 2017 val | DyHead (Swin-T, multi scale) | AP50 | 68 | # 4 |
| Object Detection | COCO 2017 val | DyHead (Swin-T, multi scale) | AP75 | 54.3 | # 1 |
| Object Detection | COCO 2017 val | DyHead (Swin-T, multi scale) | APL | 64.2 | # 4 |
| Object Detection | COCO minival | DyHead (Swin-L, multi scale) | box AP | 58.4 | # 33 |
| Object Detection | COCO minival | DyHead (Swin-L, multi scale) | AP50 | 76.8 | # 5 |
| Object Detection | COCO minival | DyHead (Swin-L, multi scale) | APS | 44.5 | # 3 |
| Object Detection | COCO minival | DyHead (Swin-L, multi scale) | APM | 62.2 | # 3 |
| Object Detection | COCO minival | DyHead (Swin-L, multi scale) | APL | 73.2 | # 5 |
| Object Detection | COCO minival | DyHead (ResNeXt-64x4d-101-DCN, multi scale) | APL | 66.3 | # 14 |
| Object Detection | COCO minival | DyHead (ResNet-101) | box AP | 46.5 | # 95 |
| Object Detection | COCO minival | DyHead (Swin-L, multi scale, self-training) | box AP | 60.3 | # 22 |
| Object Detection | COCO minival | DyHead (Swin-L, multi scale, self-training) | AP50 | 78.2 | # 2 |
| Object Detection | COCO minival | DyHead (Swin-L, multi scale, self-training) | APL | 74.2 | # 3 |
| Object Detection | COCO-O | DyHead (Swin-L) | Average mAP | 35.3 | # 9 |
| Object Detection | COCO-O | DyHead (Swin-L) | Effective Robustness | 10.00 | # 10 |
| Object Detection | COCO-O | DyHead (ResNet-50) | Average mAP | 19.3 | # 31 |
| Object Detection | COCO-O | DyHead (ResNet-50) | Effective Robustness | 0.16 | # 34 |
| Object Detection | COCO test-dev | DyHead (ResNet-50) | box mAP | 43 | # 161 |
| Object Detection | COCO test-dev | DyHead (ResNet-50) | AP50 | 60.7 | # 124 |
| Object Detection | COCO test-dev | DyHead (ResNet-50) | AP75 | 46.8 | # 105 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale, self-training) | box mAP | 60.6 | # 24 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale, self-training) | AP50 | 78.5 | # 5 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale, self-training) | AP75 | 66.6 | # 5 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale, self-training) | APM | 64.0 | # 5 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale, self-training) | APL | 74.2 | # 5 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale) | box mAP | 58.7 | # 31 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale) | AP50 | 77.1 | # 6 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale) | AP75 | 64.5 | # 6 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale) | APM | 62.0 | # 6 |
| Object Detection | COCO test-dev | DyHead (Swin-L, multi scale) | APL | 72.8 | # 6 |
| Object Detection | COCO test-dev | DyHead (ResNeXt-64x4d-101) | box mAP | 47.7 | # 112 |
| Object Detection | COCO test-dev | DyHead (ResNeXt-64x4d-101) | AP50 | 65.7 | # 71 |
| Object Detection | COCO test-dev | DyHead (ResNeXt-64x4d-101) | AP75 | 51.9 | # 60 |
| Object Detection | COCO test-dev | DyHead (ResNeXt-64x4d-101-DCN, multi scale) | box mAP | 54 | # 54 |
| Object Detection | COCO test-dev | DyHead (ResNeXt-64x4d-101-DCN, multi scale) | AP50 | 72.1 | # 17 |
| Object Detection | COCO test-dev | DyHead (ResNeXt-64x4d-101-DCN, multi scale) | AP75 | 59.3 | # 18 |
| Object Detection | COCO test-dev | DyHead (ResNet-101) | AP50 | 64.5 | # 83 |
| Object Detection | COCO test-dev | DyHead (ResNet-101) | AP75 | 50.7 | # 72 |
