Joint Forward-Backward Visual Odometry for Stereo Cameras

21 Dec 2019 · Raghav Sardana, Rahul Kottath, Vinod Karar, Shashi Poddar

Visual odometry is a widely used technique in the field of robotics and automation for keeping track of a robot's location using visual cues alone. In this paper, we propose a joint forward-backward visual odometry framework that combines both the forward and backward motion estimates obtained from stereo cameras. The basic framework of LIBVISO2 is used here for pose estimation, as it can run in real time on standard CPUs. The complementary nature of the errors in the forward and backward modes of visual odometry yields a refined motion estimate when these individual estimates are combined. In addition, two reliability measures, namely the forward-backward relative pose error and the forward-backward absolute pose error, are proposed for evaluating a visual odometry framework on its own, without requiring any ground-truth data. The proposed scheme is evaluated on the KITTI visual odometry dataset. The experimental results demonstrate improved accuracy of the proposed scheme over the traditional odometry pipeline without much increase in computational overhead.
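The two ingredients described in the abstract lend themselves to a short sketch. The following minimal Python/NumPy illustration is not the authors' code: it assumes 4x4 homogeneous SE(3) motion matrices, and the simple fusion rule (averaging the forward estimate with the inverted backward estimate, with an SO(3) projection for the rotation) and the forward-backward consistency error (composing the two motions and measuring the deviation from identity) are illustrative choices only, not necessarily the paper's exact formulation.

```python
# Hypothetical sketch of joint forward-backward pose fusion and a
# forward-backward consistency check; not the authors' implementation.
import numpy as np

def invert_se3(T):
    """Invert a 4x4 rigid-body transform."""
    R, t = T[:3, :3], T[:3, 3]
    Tinv = np.eye(4)
    Tinv[:3, :3] = R.T
    Tinv[:3, 3] = -R.T @ t
    return Tinv

def project_to_so3(M):
    """Project a near-rotation matrix back onto SO(3) via SVD."""
    U, _, Vt = np.linalg.svd(M)
    R = U @ Vt
    if np.linalg.det(R) < 0:  # keep a proper rotation (det = +1)
        U[:, -1] *= -1
        R = U @ Vt
    return R

def fuse_forward_backward(T_fwd, T_bwd):
    """Combine the forward motion estimate with the inverted backward one
    by simple averaging (an assumed fusion rule for illustration)."""
    T_bwd_inv = invert_se3(T_bwd)  # backward motion expressed as a forward motion
    T = np.eye(4)
    T[:3, :3] = project_to_so3(0.5 * (T_fwd[:3, :3] + T_bwd_inv[:3, :3]))
    T[:3, 3] = 0.5 * (T_fwd[:3, 3] + T_bwd_inv[:3, 3])
    return T

def forward_backward_error(T_fwd, T_bwd):
    """Consistency measure: composing the forward and backward motions
    should give the identity; return translational and rotational deviation."""
    E = T_fwd @ T_bwd
    trans_err = np.linalg.norm(E[:3, 3])
    rot_err = np.arccos(np.clip((np.trace(E[:3, :3]) - 1.0) / 2.0, -1.0, 1.0))
    return trans_err, rot_err

if __name__ == "__main__":
    # Toy check: 1 m forward translation along z with a consistent backward estimate.
    T_f = np.eye(4); T_f[2, 3] = 1.0
    T_b = invert_se3(T_f)
    print(fuse_forward_backward(T_f, T_b))   # ~= T_f
    print(forward_backward_error(T_f, T_b))  # ~= (0.0, 0.0)
```

In practice, such a consistency error could serve as the kind of ground-truth-free reliability measure the abstract mentions, since a large forward-backward discrepancy signals an unreliable frame-to-frame estimate.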
