3D Object Localization Using 2D Estimates for Computer Vision Applications

A technique for object localization based on pose estimation and camera calibration is presented. The 3-dimensional (3D) coordinates of an object are estimated from multiple 2-dimensional (2D) images of the object, which are also used to calibrate the camera. The calibration steps, which involve computing a number of parameters, including the intrinsic and extrinsic parameters used to remove lens distortion, the object's size, and the camera's position, are discussed. A transformation strategy for estimating the 3D pose from the 2D images is presented. The proposed method is implemented in MATLAB, and validation experiments are carried out for both pose estimation and camera calibration.
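The abstract does not include code, so the following is only a minimal sketch of the calibrate-then-localize pipeline it describes, written with OpenCV in Python rather than the authors' MATLAB implementation. The checkerboard dimensions, square size, file names, and object model points are illustrative assumptions, not values from the paper.

```python
# Sketch: camera calibration from multiple 2D views, lens-distortion removal,
# and camera/object pose recovery via PnP. All numeric values are assumed.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)        # inner corners of the calibration checkerboard (assumed)
SQUARE_SIZE = 25.0      # checkerboard square edge in millimetres (assumed)

# 3D coordinates of the checkerboard corners in the board's own frame (Z = 0 plane).
board_pts = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
board_pts[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
for path in glob.glob("calib_*.png"):          # multiple 2D views of the pattern (assumed file names)
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(board_pts)
        img_points.append(corners)

# Intrinsic matrix K and lens-distortion coefficients estimated from the collected views.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)

# Remove lens distortion from a new image of the target object.
frame = cv2.imread("object_view.png")
undistorted = cv2.undistort(frame, K, dist)

# Extrinsics (rotation and translation between camera and object) from known 3D
# model points of the object and their detected 2D projections, both assumed here.
model_pts = np.array([[0, 0, 0], [100, 0, 0], [100, 100, 0], [0, 100, 0]], np.float32)
image_pts = np.array([[320, 240], [420, 238], [424, 338], [318, 342]], np.float32)
ok, rvec, tvec = cv2.solvePnP(model_pts, image_pts, K, dist)

# Camera position in the object's coordinate frame: C = -R^T t.
R, _ = cv2.Rodrigues(rvec)
camera_position = -R.T @ tvec
print("Estimated camera position (object frame):", camera_position.ravel())
```

The same pipeline maps onto MATLAB's Computer Vision Toolbox calibration and pose-estimation tools; the choice of OpenCV here is only for a self-contained, runnable illustration.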
