In this paper, we present BirdSLAM, a novel simultaneous localization and mapping (SLAM) system for the challenging scenario of autonomous driving platforms equipped with only a monocular camera.
To the best of our knowledge, ours is the first method to accurately forecast trajectories at a very high prediction rate of 78 trajectories per second on a CPU.
In this paper, we tackle the problem of multibody SLAM from a monocular camera.
Unlike state-of-the-art approaches, our representations and models generalize to entirely different datasets, collected across several cities and across countries where people drive on opposite sides of the road (left-hand vs. right-hand traffic).
The proposed approach significantly improves the state-of-the-art for monocular object localization on arbitrarily-shaped roads.
This paper introduces geometric and object shape-and-pose costs for multi-object tracking in urban driving scenarios.
Ranked #2 on 3D Multi-Object Tracking on KITTI