DNN Filter for Bias Reduction in Distribution-to-Distribution Scan Matching

8 Nov 2022 · Matthew McDermott, Jason Rife

Distribution-to-distribution (D2D) point cloud registration techniques such as the Normal Distributions Transform (NDT) can align point clouds sampled from unstructured scenes and provide accurate bounds on their own solution error covariance -- an important feature for safety-of-life navigation tasks. D2D methods rely on the assumption of a static scene and are therefore susceptible to bias from range shadowing, self-occlusion, moving objects, and distortion artifacts introduced as the recording device moves between frames. Deep learning-based approaches can achieve higher accuracy in dynamic scenes by relaxing these constraints; however, DNNs produce uninterpretable solutions, which can be problematic from a safety perspective. In this paper, we propose a method of down-sampling LIDAR point clouds to exclude voxels that violate the static-scene assumption and introduce error into the D2D scan matching process. Our approach uses a solution consistency filter, identifying and suppressing voxels where D2D contributions disagree with local estimates from a PointNet-based registration network. Our results show that this technique significantly improves registration accuracy and is particularly useful in scenes containing dense foliage.
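To illustrate the general idea of a per-voxel solution consistency filter, the sketch below (not the authors' code) bins a point cloud into voxels and discards voxels where a D2D-style local shift estimate and a network-predicted shift disagree beyond a tolerance. The function names, the `d2d_shift`/`dnn_shift` inputs, and the `tol` threshold are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of voxel-level consistency filtering for scan matching.
# Assumes per-voxel shift estimates from two sources are already available:
# a D2D (NDT-style) local estimate and a hypothetical network prediction.
import numpy as np

def voxelize(points, voxel_size=1.0):
    """Group points by integer voxel index; return {index: (N_i, 3) array}."""
    keys = np.floor(points / voxel_size).astype(int)
    voxels = {}
    for key, p in zip(map(tuple, keys), points):
        voxels.setdefault(key, []).append(p)
    return {k: np.asarray(v) for k, v in voxels.items()}

def consistency_filter(voxels, d2d_shift, dnn_shift, tol=0.2):
    """Keep only voxels whose two local shift estimates agree within `tol`
    (the threshold value here is an assumption for illustration)."""
    kept = {}
    for key, pts in voxels.items():
        if key not in d2d_shift or key not in dnn_shift:
            continue
        if np.linalg.norm(d2d_shift[key] - dnn_shift[key]) <= tol:
            kept[key] = pts
    return kept

if __name__ == "__main__":
    # Synthetic example: one "static" voxel (estimates agree) and one
    # "dynamic" voxel (estimates disagree, e.g. a moving object or foliage).
    rng = np.random.default_rng(0)
    pts = np.vstack([rng.normal([0.5, 0.5, 0.5], 0.1, (50, 3)),
                     rng.normal([5.5, 0.5, 0.5], 0.1, (50, 3))])
    vox = voxelize(pts, voxel_size=1.0)
    d2d = {(0, 0, 0): np.array([0.05, 0.0, 0.0]),
           (5, 0, 0): np.array([0.90, 0.0, 0.0])}
    dnn = {(0, 0, 0): np.array([0.04, 0.01, 0.0]),
           (5, 0, 0): np.array([0.10, 0.0, 0.0])}
    kept = consistency_filter(vox, d2d, dnn, tol=0.2)
    print("kept voxels:", list(kept.keys()))  # only the consistent voxel survives
```

In this toy example, only the voxel whose D2D and network estimates agree is retained for registration; the inconsistent voxel is suppressed, mirroring the intent of excluding regions that violate the static-scene assumption.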
