Towards Grand Unification of Object Tracking

14 Jul 2022 · Bin Yan, Yi Jiang, Peize Sun, Dong Wang, Zehuan Yuan, Ping Luo, Huchuan Lu

We present a unified method, termed Unicorn, that can simultaneously solve four tracking problems (SOT, MOT, VOS, MOTS) with a single network using the same model parameters. Because the object tracking problem itself has fragmented definitions, most existing trackers are developed to address a single task or a subset of tasks and overspecialize on the characteristics of those specific tasks. By contrast, Unicorn provides a unified solution, adopting the same input, backbone, embedding, and head across all tracking tasks. For the first time, we accomplish the grand unification of the tracking network architecture and learning paradigm. Unicorn performs on par with or better than its task-specific counterparts on 8 tracking datasets, including LaSOT, TrackingNet, MOT17, BDD100K, DAVIS16-17, MOTS20, and BDD100K MOTS. We believe that Unicorn will serve as a solid step towards the general vision model. Code is available at https://github.com/MasterBin-IIAU/Unicorn.
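The abstract does not detail the architecture, so the following is only a minimal PyTorch sketch of what a single network with a shared input format, backbone, embedding, and head across SOT, MOT, VOS, and MOTS could look like. All names (UnifiedTracker, the pixel-wise correspondence step, the target-prior map, the head layout) are illustrative assumptions, not Unicorn's actual implementation.

```python
# Hypothetical sketch: one network whose backbone, embedding, and heads are
# shared across box-level tracking (SOT/MOT) and mask-level tracking (VOS/MOTS).
# Shapes, layer choices, and fusion scheme are assumptions for illustration.
import torch
import torch.nn as nn


class UnifiedTracker(nn.Module):
    def __init__(self, embed_dim=256, num_classes=1):
        super().__init__()
        # Shared backbone: extracts features from both reference and current frames.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, 7, stride=4, padding=3), nn.ReLU(),
            nn.Conv2d(64, embed_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Shared embedding: projects features into a space where pixel-wise
        # correspondence between frames is computed (association in MOT/MOTS,
        # target propagation in SOT/VOS).
        self.embed = nn.Conv2d(embed_dim, embed_dim, 1)
        # Shared head: class logits, per-pixel boxes, and mask logits,
        # regardless of which tracking task is being solved.
        self.cls_head = nn.Conv2d(embed_dim, num_classes, 3, padding=1)
        self.box_head = nn.Conv2d(embed_dim, 4, 3, padding=1)
        self.mask_head = nn.Conv2d(embed_dim, 1, 3, padding=1)

    def correspondence(self, ref_emb, cur_emb):
        # Dense pixel-to-pixel similarity between reference and current frames.
        b, c, h, w = cur_emb.shape
        ref = ref_emb.flatten(2)                              # (B, C, HW_ref)
        cur = cur_emb.flatten(2)                              # (B, C, HW_cur)
        return torch.einsum("bci,bcj->bij", ref, cur) / c ** 0.5

    def forward(self, ref_frame, cur_frame, ref_target_map):
        # ref_target_map: per-pixel target prior on the reference frame
        # (a box rendered as a map for SOT/MOT, a mask for VOS/MOTS).
        ref_feat = self.backbone(ref_frame)
        cur_feat = self.backbone(cur_frame)
        ref_emb, cur_emb = self.embed(ref_feat), self.embed(cur_feat)

        # Propagate the target prior to the current frame via correspondence.
        corr = self.correspondence(ref_emb, cur_emb)          # (B, HWr, HWc)
        prior = nn.functional.interpolate(
            ref_target_map, size=ref_emb.shape[-2:]).flatten(2)  # (B, 1, HWr)
        propagated = torch.bmm(prior, corr.softmax(dim=1))       # (B, 1, HWc)
        propagated = propagated.view(-1, 1, *cur_emb.shape[-2:])

        # Inject the propagated target prior into the current-frame features.
        fused = cur_feat * (1 + propagated)
        return {
            "cls": self.cls_head(fused),    # detection / foreground logits
            "box": self.box_head(fused),    # per-pixel box regression
            "mask": self.mask_head(fused),  # mask logits (VOS/MOTS)
        }


if __name__ == "__main__":
    model = UnifiedTracker()
    ref = torch.randn(1, 3, 256, 256)
    cur = torch.randn(1, 3, 256, 256)
    prior = torch.zeros(1, 1, 256, 256)
    prior[:, :, 96:160, 96:160] = 1.0       # reference-frame target region
    out = model(ref, cur, prior)
    print({k: v.shape for k, v in out.items()})
```

The point of the sketch is the weight sharing: the four tasks differ only in how the target prior is supplied (box map vs. mask) and which head output is consumed, while the backbone, embedding, and heads stay identical, mirroring the paper's claim of a single network with the same parameters for all tasks.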

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
| --- | --- | --- | --- | --- | --- |
| Multi-Object Tracking and Segmentation | BDD100K val | Unicorn | mMOTSA | 29.6 | #2 |
| Multiple Object Tracking | BDD100K val | Unicorn | mMOTA | 41.2 | #6 |
| Multiple Object Tracking | BDD100K val | Unicorn | mIDF1 | 54.0 | #6 |
| Visual Object Tracking | LaSOT | Unicorn | AUC | 68.5 | #20 |
| Visual Object Tracking | LaSOT | Unicorn | Normalized Precision | 76.6 | #18 |
| Visual Object Tracking | LaSOT | Unicorn | Precision | 74.1 | #15 |
| Multi-Object Tracking | MOT17 | Unicorn | MOTA | 77.2 | #11 |
| Multi-Object Tracking | MOT17 | Unicorn | IDF1 | 75.5 | #12 |
| Multi-Object Tracking | MOT17 | Unicorn | HOTA | 61.7 | #11 |
| Multi-Object Tracking | MOTS20 | Unicorn | sMOTSA | 65.3 | #3 |
| Multi-Object Tracking | MOTS20 | Unicorn | IDF1 | 65.9 | #2 |
| Visual Object Tracking | TrackingNet | Unicorn | Precision | 82.2 | #11 |
| Visual Object Tracking | TrackingNet | Unicorn | Normalized Precision | 86.4 | #17 |
| Visual Object Tracking | TrackingNet | Unicorn | Accuracy | 83 | #15 |