Hand-guided 3D surface acquisition by combining simple light sectioning with real-time algorithms

9 Jan 2014  ·  Oliver Arold, Svenja Ettl, Florian Willomitzer, Gerd Häusler

Precise 3D measurements of rigid surfaces are desired in many fields of application, such as quality control or surgery. Often, views from all around the object have to be acquired for a full 3D description of the object surface. We present a sensor principle called "Flying Triangulation" which avoids an elaborate "stop-and-go" procedure. It combines a low-cost classical light-section sensor with an algorithmic pipeline. A hand-guided sensor captures a continuous movie of 3D views while being moved around the object. The views are automatically aligned and the acquired 3D model is displayed in real time. In contrast to most existing sensors, no bandwidth is wasted for spatial or temporal encoding of the projected lines, nor is an expensive color camera necessary for 3D acquisition. The achievable measurement uncertainty and lateral resolution of the generated 3D data are merely limited by physics. An alternating projection of vertical and horizontal lines guarantees the existence of corresponding points in successive 3D views. This enables a precise registration without surface interpolation. For registration, a variant of the iterative closest point algorithm, adapted to the specific nature of our 3D views, is introduced. Furthermore, data reduction and smoothing without loss of lateral resolution, as well as the acquisition and mapping of a color texture, are presented. The precision and applicability of the sensor are demonstrated by simulation and measurement results.
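
The abstract refers to a variant of the iterative closest point (ICP) algorithm for registering successive sparse 3D line views. The paper's variant exploits the corresponding points created by alternately projected vertical and horizontal lines; the sketch below is only a minimal, generic illustration of closest-point rigid registration (nearest-neighbour correspondences plus a closed-form SVD fit), not the authors' implementation. Names such as `register_views`, `max_iters`, and `tol` are illustrative assumptions.

```python
# Minimal ICP sketch: align a "moving" 3D view onto a "fixed" one by
# alternating nearest-neighbour correspondence search with a closed-form
# rigid-body fit (Kabsch/SVD). Not the paper's adapted ICP variant.
import numpy as np
from scipy.spatial import cKDTree


def best_rigid_transform(src, dst):
    """Closed-form rigid transform (R, t) minimizing ||R @ src.T + t - dst.T||."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def register_views(moving, fixed, max_iters=50, tol=1e-8):
    """Align `moving` (N x 3 points) onto `fixed` (M x 3 points) via ICP."""
    tree = cKDTree(fixed)
    R_total, t_total = np.eye(3), np.zeros(3)
    pts = moving.copy()
    prev_err = np.inf
    for _ in range(max_iters):
        dists, idx = tree.query(pts)                  # closest-point correspondences
        R, t = best_rigid_transform(pts, fixed[idx])  # incremental rigid fit
        pts = pts @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = np.mean(dists ** 2)
        if abs(prev_err - err) < tol:                 # converged
            break
        prev_err = err
    return R_total, t_total
```

In Flying Triangulation the alternating vertical/horizontal line projection guarantees genuine corresponding points where the line patterns of successive views cross, which is what allows registration without surface interpolation; the generic nearest-neighbour search above stands in for that correspondence step purely for illustration.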
