RigidFusion: Robot Localisation and Mapping in Environments with Large Dynamic Rigid Objects

21 Oct 2020 · Ran Long, Christian Rauch, Tianwei Zhang, Vladimir Ivan, Sethu Vijayakumar

This work presents a novel RGB-D SLAM approach to simultaneously segment, track and reconstruct the static background and large dynamic rigid objects that can occlude major portions of the camera view. Previous approaches treat dynamic parts of a scene as outliers and are thus limited to a small amount of change in the scene, or they rely on prior information for all objects in the scene to enable robust camera tracking. Here, we propose to treat all dynamic parts as one rigid body and to simultaneously segment and track both the static and dynamic components. We therefore enable simultaneous localisation and reconstruction of both the static background and the rigid dynamic components in environments where dynamic objects cause large occlusions. We evaluate our approach on multiple challenging scenes with large dynamic occlusion. The evaluation demonstrates that our approach achieves better motion segmentation, localisation and mapping without requiring prior knowledge of the dynamic object's shape and appearance.
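
The core idea, treating the static background and all dynamic parts as two rigid bodies whose motions are estimated jointly with the segmentation, can be illustrated with a small sketch. The Python code below is a conceptual illustration only, not the authors' implementation: the function names, the alternating assign/re-estimate loop, and the crude bootstrap split are all assumptions made for the example.

```python
# Conceptual sketch (an assumption, not the paper's pipeline) of the
# two-rigid-motion idea: each 3D point is explained either by the static
# background (camera) motion or by a single dynamic rigid-body motion,
# and segmentation and motion estimation alternate.
import numpy as np


def fit_rigid(src, dst):
    """Least-squares rigid motion (R, t) mapping src to dst (Kabsch / SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t


def residuals(src, dst, R, t):
    """Per-point error after applying a candidate rigid motion (R, t)."""
    return np.linalg.norm(src @ R.T + t - dst, axis=1)


def segment_and_track(src, dst, n_iters=10):
    """Alternate between (1) assigning each point to the motion model that
    explains it better and (2) re-estimating both rigid motions from the
    points assigned to them. Returns the static mask and both motions."""
    assert len(src) >= 6, "need enough points for both motion models"
    # Crude bootstrap split for illustration; a real system would need a
    # better initialisation (e.g. motion priors or robust estimation).
    static = np.arange(len(src)) % 2 == 0
    for _ in range(n_iters):
        if static.sum() < 3 or (~static).sum() < 3:
            break  # too few points to estimate one of the rigid motions
        R_s, t_s = fit_rigid(src[static], dst[static])    # background motion
        R_d, t_d = fit_rigid(src[~static], dst[~static])  # dynamic object
        static = residuals(src, dst, R_s, t_s) <= residuals(src, dst, R_d, t_d)
    return static, (R_s, t_s), (R_d, t_d)
```

A dense RGB-D system would evaluate per-pixel photometric and geometric residuals rather than sparse point correspondences, but the high-level structure, segmenting by which of the two rigid motions explains each measurement better and refining both motions from their segments, is analogous.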


Categories


Robotics
