Towards Grand Unification of Object Tracking

14 Jul 2022 · Bin Yan, Yi Jiang, Peize Sun, Dong Wang, Zehuan Yuan, Ping Luo, Huchuan Lu

We present a unified method, termed Unicorn, that can simultaneously solve four tracking problems (single object tracking, SOT; multiple object tracking, MOT; video object segmentation, VOS; and multi-object tracking and segmentation, MOTS) with a single network using the same model parameters. Because the object tracking problem itself has fragmented definitions, most existing trackers are developed to address a single task or a subset of tasks and overspecialize on the characteristics of those specific tasks. By contrast, Unicorn provides a unified solution, adopting the same input, backbone, embedding, and head across all tracking tasks. For the first time, we accomplish the grand unification of the tracking network architecture and learning paradigm. Unicorn performs on par with or better than its task-specific counterparts on 8 tracking datasets, including LaSOT, TrackingNet, MOT17, BDD100K, DAVIS16-17, MOTS20, and BDD100K MOTS. We believe that Unicorn will serve as a solid step towards a general vision model. Code is available at https://github.com/MasterBin-IIAU/Unicorn.
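To make the "same input, backbone, embedding, and head" idea concrete, the following is a minimal, illustrative PyTorch sketch of a tracker whose parameters are fully shared regardless of which tracking task a clip comes from. It is not the Unicorn implementation (see the linked repository for that); the class name, layer choices, and the affinity-based frame correspondence are all hypothetical simplifications intended only to show the shape of such a unified architecture.

```python
# Illustrative sketch only: one backbone, one embedding, one head, shared across
# SOT/MOT/VOS/MOTS-style inputs. Names and layers are hypothetical and do NOT
# reproduce the official Unicorn code.
import torch
import torch.nn as nn


class UnifiedTracker(nn.Module):
    """Every parameter is reused for all tracking tasks."""

    def __init__(self, feat_dim: int = 256):
        super().__init__()
        # One backbone applied to both the reference frame and the current frame.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_dim, kernel_size=7, stride=4, padding=3),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat_dim, feat_dim, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # One embedding projection used to relate targets across frames.
        self.embed = nn.Conv2d(feat_dim, feat_dim, kernel_size=1)
        # One head producing per-pixel box offsets plus a mask/score logit.
        self.head = nn.Conv2d(feat_dim, 5, kernel_size=1)

    def forward(self, ref_frame: torch.Tensor, cur_frame: torch.Tensor) -> torch.Tensor:
        ref_feat = self.embed(self.backbone(ref_frame))
        cur_feat = self.embed(self.backbone(cur_frame))
        b, c, h, w = cur_feat.shape
        ref_flat = ref_feat.flatten(2)   # (B, C, HW)
        cur_flat = cur_feat.flatten(2)   # (B, C, HW)
        # Dense affinity between current- and reference-frame pixels,
        # used to propagate reference information to the current frame.
        affinity = torch.softmax(cur_flat.transpose(1, 2) @ ref_flat, dim=-1)      # (B, HW, HW)
        propagated = (affinity @ ref_flat.transpose(1, 2)).transpose(1, 2)         # (B, C, HW)
        fused = cur_feat + propagated.reshape(b, c, h, w)
        return self.head(fused)          # (B, 5, H', W')


if __name__ == "__main__":
    model = UnifiedTracker()
    ref = torch.randn(1, 3, 256, 256)    # reference frame (e.g. the annotated first frame)
    cur = torch.randn(1, 3, 256, 256)    # current frame to track in
    print(model(ref, cur).shape)         # torch.Size([1, 5, 32, 32])
```

In this toy setup the only thing that would differ between tasks is the training data and how the head's outputs are decoded (boxes for SOT/MOT, masks for VOS/MOTS); the network weights themselves stay identical, which is the gist of the unification the abstract describes.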

Task                                     | Dataset     | Model   | Metric Name          | Metric Value | Global Rank
-----------------------------------------|-------------|---------|----------------------|--------------|------------
Multiple Object Tracking                 | BDD100K val | Unicorn | mMOTA                | 41.2         | #6
Multiple Object Tracking                 | BDD100K val | Unicorn | mIDF1                | 54.0         | #6
Multiple Object Tracking                 | BDD100K val | Unicorn | TETA                 | -            | #5
Multi-Object Tracking and Segmentation   | BDD100K val | Unicorn | mMOTSA               | 29.6         | #2
Visual Object Tracking                   | LaSOT       | Unicorn | AUC                  | 68.5         | #26
Visual Object Tracking                   | LaSOT       | Unicorn | Normalized Precision | 76.6         | #23
Visual Object Tracking                   | LaSOT       | Unicorn | Precision            | 74.1         | #20
Multi-Object Tracking                    | MOT17       | Unicorn | MOTA                 | 77.2         | #13
Multi-Object Tracking                    | MOT17       | Unicorn | IDF1                 | 75.5         | #15
Multi-Object Tracking                    | MOT17       | Unicorn | HOTA                 | 61.7         | #14
Multi-Object Tracking                    | MOTS20      | Unicorn | sMOTSA               | 65.3         | #3
Multi-Object Tracking                    | MOTS20      | Unicorn | IDF1                 | 65.9         | #2
Visual Object Tracking                   | TrackingNet | Unicorn | Precision            | 82.2         | #14
Visual Object Tracking                   | TrackingNet | Unicorn | Normalized Precision | 86.4         | #20
Visual Object Tracking                   | TrackingNet | Unicorn | Accuracy             | 83           | #18
