
SensorX2car: Sensors-to-car calibration for autonomous driving in road scenarios

Properly calibrated sensors are a prerequisite for a dependable autonomous driving system. However, most prior methods focus on extrinsic calibration between sensors, and few address the misalignment between the sensors and the vehicle coordinate system. Existing targetless approaches rely on specific prior knowledge, such as driving routes and road features, to handle this misalignment. This work removes these limitations and proposes more general calibration methods for four commonly used sensors: camera, LiDAR, GNSS/INS, and millimeter-wave radar. By exploiting sensor-specific patterns (image features, 3D LiDAR points, GNSS/INS-solved poses, and radar-measured speed), we design four corresponding methods that calibrate mainly the rotation from sensor to car within minutes of normal driving, and we assemble them into a toolbox named SensorX2car. Real-world and simulated experiments demonstrate the practicality of the proposed methods, and the related code has been open-sourced to benefit the community. To the best of our knowledge, SensorX2car is the first open-source sensor-to-car calibration toolbox. The code is available at https://github.com/OpenCalib/SensorX2car.
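To illustrate the general idea behind this kind of sensor-to-car rotation calibration, the minimal sketch below estimates the yaw offset between a GNSS/INS heading and the vehicle's direction of travel derived from consecutive positions. This is a hypothetical simplification for intuition only, not the authors' implementation; the function name, the ENU position/heading conventions, and the straight-driving speed threshold are all assumptions.

```python
import numpy as np

def estimate_gnss_ins_yaw_offset(positions, headings, min_speed=2.0, dt=0.1):
    """Rough sketch: estimate the yaw offset between the GNSS/INS heading
    and the vehicle's direction of travel (hypothetical, not SensorX2car's code).

    positions : (N, 2) array of east/north positions in meters (ENU)
    headings  : (N,) array of GNSS/INS yaw angles in radians, measured from east
    min_speed : ignore samples slower than this (m/s), i.e. near-stationary
    dt        : sampling interval in seconds
    """
    positions = np.asarray(positions, dtype=float)
    headings = np.asarray(headings, dtype=float)

    # Direction of travel from consecutive positions (finite differences).
    deltas = np.diff(positions, axis=0)
    speeds = np.linalg.norm(deltas, axis=1) / dt
    travel_yaw = np.arctan2(deltas[:, 1], deltas[:, 0])

    # Keep only samples where the car is clearly moving.
    mask = speeds > min_speed
    if not np.any(mask):
        raise ValueError("no valid moving samples")

    # Wrapped angular difference between sensor heading and motion direction.
    diff = headings[:-1][mask] - travel_yaw[mask]
    diff = np.arctan2(np.sin(diff), np.cos(diff))

    # Circular mean gives a single yaw-offset estimate.
    return np.arctan2(np.mean(np.sin(diff)), np.mean(np.cos(diff)))
```

During roughly straight driving this angular difference is dominated by the mounting yaw offset; handling curved segments, the other rotation axes, and the remaining three sensors requires the sensor-specific methods described in the paper.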
