Cross-Drone Transformer Network for Robust Single Object Tracking

Drones have been widely used in a variety of applications, e.g., aerial photography and military security, because of their high maneuverability and broad views compared with fixed cameras. Multi-drone tracking systems can provide rich information about targets by collecting complementary video clips from different views, especially when targets are occluded or disappear in some views. However, handling cross-drone information interaction and multi-drone information fusion in multi-drone visual tracking remains challenging. Recently, Transformer has shown significant advantages in automatically modeling the correlation between templates and search regions for visual tracking. To leverage its potential in multi-drone tracking, we propose a novel cross-drone Transformer network (TransMDOT) for visual object tracking. The self-attention mechanism automatically captures the correlation between multiple templates and the corresponding search region to achieve multi-drone feature fusion. During tracking, a cross-drone mapping mechanism uses the surrounding information of a drone with a promising tracking status as a reference to help drones that have lost the target re-calibrate, enabling real-time cross-drone information interaction. As existing multi-drone evaluation metrics consider only spatial information while ignoring temporal information, we further present a system perception index (SPFI) that combines both temporal and spatial information to evaluate the tracking status of multiple drones. Experiments on the MDOT dataset show that TransMDOT greatly surpasses state-of-the-art methods in both single-drone performance and multi-drone system fusion performance. Our code will be available at https://github.com/cgjacklin/transmdot.
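To make the fusion idea concrete, the sketch below shows how self-attention over the concatenation of multi-drone template tokens and search-region tokens can mix information across views. This is only an illustrative toy in NumPy under assumed token shapes, not the paper's actual TransMDOT implementation; the function and variable names here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_multi_template(templates, search):
    """Toy scaled dot-product self-attention over concatenated tokens.

    templates: list of (n_i, d) arrays, one per drone view (assumed shapes)
    search:    (m, d) array of search-region tokens
    Returns the attended search-region tokens, shape (m, d): each search
    token is a weighted mix of all template and search tokens, so template
    features from every drone view can flow into the search representation.
    """
    tokens = np.concatenate(templates + [search], axis=0)   # (N, d)
    d = tokens.shape[1]
    attn = softmax(tokens @ tokens.T / np.sqrt(d), axis=-1)  # (N, N) weights
    fused = attn @ tokens                                    # (N, d)
    return fused[-search.shape[0]:]                          # keep search part

# Usage with two drone views and a small search region:
rng = np.random.default_rng(0)
views = [rng.standard_normal((4, 8)), rng.standard_normal((4, 8))]
search = rng.standard_normal((6, 8))
out = fuse_multi_template(views, search)  # shape (6, 8)
```

A real tracker would use learned query/key/value projections and multiple heads; the single unprojected attention pass above is just the minimal form of the cross-view mixing the abstract describes.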
