Camera-lidar calibration is a fundamental step in many tasks: its accuracy sets an upper bound on how well downstream sensor fusion can perform. Autonomous driving and robotics companies invest considerable manpower and resources in continuously improving calibration accuracy, so in this article we introduce several commonly used camera-lidar calibration toolboxes that are well worth bookmarking.
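Before going through the tools, here is a minimal sketch of what the calibration result is ultimately used for: projecting lidar points into the camera image with an extrinsic transform and a pinhole intrinsic matrix. All names and numeric values below (`T_cam_lidar`, `K`, the random points) are illustrative assumptions, not taken from any specific toolbox.

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project Nx3 lidar points to pixel coordinates with a pinhole model.

    points_lidar: (N, 3) points in the lidar frame.
    T_cam_lidar:  (4, 4) extrinsic transform (lidar frame -> camera frame).
    K:            (3, 3) camera intrinsic matrix.
    """
    # Homogeneous coordinates, then transform into the camera frame.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera (positive depth).
    pts_cam = pts_cam[pts_cam[:, 2] > 0]

    # Pinhole projection: [u, v, 1]^T ~ K [X, Y, Z]^T.
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]

# Illustrative values only -- replace with the calibrated K and T_cam_lidar.
K = np.array([[900.0, 0.0, 640.0],
              [0.0, 900.0, 360.0],
              [0.0, 0.0, 1.0]])
T_cam_lidar = np.eye(4)
points = np.random.uniform(-5.0, 5.0, size=(1000, 3)) + np.array([0.0, 0.0, 15.0])
print(project_lidar_to_image(points, T_cam_lidar, K)[:3])
```

The quality of the calibrated `T_cam_lidar` and `K` directly determines how well the projected points line up with image content, which is why the tools below exist.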
Multiple checkerboard detection in one shot: https://www.cvlibs.net/software/libcbdetect/
The algorithm, written in MATLAB, automatically extracts corner points with sub-pixel accuracy and assembles them into rectangular checkerboard patterns. It can handle images from different camera types (such as pinhole, fisheye, and panoramic/omnidirectional cameras).
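libcbdetect itself is distributed as MATLAB/C++ code; as a rough, hedged illustration of the same workflow (coarse checkerboard corner detection followed by sub-pixel refinement), the sketch below uses OpenCV's standard functions instead. Unlike libcbdetect, this finds only a single board per image, and the image path and inner-corner count are placeholders.

```python
import cv2

# Placeholder image path and inner-corner count (columns x rows) of the board.
image = cv2.imread("checkerboard.png", cv2.IMREAD_GRAYSCALE)
if image is None:
    raise FileNotFoundError("checkerboard.png not found")
pattern_size = (9, 6)

# Coarse corner detection.
found, corners = cv2.findChessboardCorners(image, pattern_size)

if found:
    # Sub-pixel refinement of each detected corner.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)
    corners = cv2.cornerSubPix(image, corners, (11, 11), (-1, -1), criteria)
    print(f"Refined {len(corners)} corners to sub-pixel accuracy")
else:
    print("Checkerboard not found")
```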
A lidar-camera calibration toolkit from the Autoware framework.
Link: https://github.com/autowarefoundation/autoware_ai_utilities/tree/master/autoware_camera_lidar_calibrator
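Correspondence-based tools such as this one ultimately boil down to estimating the extrinsic transform from matched 3D lidar points and 2D image pixels. The following hedged sketch shows that core step with OpenCV's PnP solver; it is not the Autoware tool's actual code, and the correspondences and intrinsics are made-up placeholders.

```python
import cv2
import numpy as np

# Placeholder 3D-2D correspondences: lidar points (metres) and their
# manually picked pixel locations in the camera image.
object_points = np.array([[2.1, 0.4, 0.0], [2.0, -0.6, 0.1],
                          [3.5, 0.5, -0.2], [3.4, -0.5, 0.3],
                          [5.0, 0.0, 0.0], [4.8, 1.0, 0.5]], dtype=np.float64)
image_points = np.array([[610.0, 380.0], [720.0, 385.0],
                         [630.0, 350.0], [705.0, 352.0],
                         [660.0, 340.0], [600.0, 330.0]], dtype=np.float64)

# Assumed camera intrinsics and zero distortion (replace with real values).
K = np.array([[900.0, 0.0, 640.0],
              [0.0, 900.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Solve for the lidar-to-camera rotation and translation.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)

T_cam_lidar = np.eye(4)
T_cam_lidar[:3, :3] = R
T_cam_lidar[:3, 3] = tvec.ravel()
print(T_cam_lidar)
```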
Livox provides a camera-lidar calibration toolkit for its lidars.
Link: https://github.com/Livox-SDK/livox_camera_lidar_calibration
Chinese documentation: https://github.com/Livox-SDK/livox_camera_lidar_calibration/blob/master/doc_resources/README_cn.md
CalibrationTools provides calibration tools for lidar-lidar, lidar-camera, and other sensor pairs. In addition, it provides:
1) Localization: a bias estimation tool that estimates the parameters of the sensors used for dead reckoning (IMU and odometry) for better localization performance;
2) Visualization and analysis tool for Autoware control output;
3) A calibration tool for fixing vehicle command delays (a minimal illustration of the delay-estimation idea is sketched after the link below).
Link: https://github.com/tier4/CalibrationTools
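As promised in item 3 above, here is a hedged illustration of the general idea behind delay calibration, not CalibrationTools' actual algorithm: cross-correlate the commanded signal with the measured response and read the delay off the correlation peak. The signals and sampling period are synthetic.

```python
import numpy as np

def estimate_delay(command, response, dt):
    """Estimate how many seconds `response` lags behind `command`
    by locating the peak of their cross-correlation."""
    command = command - command.mean()
    response = response - response.mean()
    corr = np.correlate(response, command, mode="full")
    lag = np.argmax(corr) - (len(command) - 1)
    return lag * dt

# Synthetic example: the "measured" steering lags the command by 0.25 s.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
command = np.sin(2 * np.pi * 0.5 * t)
response = np.roll(command, 25) + 0.05 * np.random.randn(len(t))

print(f"Estimated delay: {estimate_delay(command, response, dt):.2f} s")
```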
This toolbox (direct_visual_lidar_calibration) has the following properties:
1) General: it can handle a variety of lidar and camera projection models, including rotating and non-repetitive-scanning lidars, as well as pinhole, fisheye, and omnidirectional cameras;
2) Target-less: it requires no calibration target and instead uses environment structure and texture for calibration;
3) Single-shot: calibration requires as little as one pair of a lidar point cloud and a camera image; optionally, multiple lidar-camera data pairs can be used to improve accuracy;
4) Automatic: the calibration process is automatic and requires no initial guess;
5) Accurate and robust: it uses a pixel-level direct lidar-camera registration algorithm, which is more accurate and robust than edge-based indirect lidar-camera registration.
Link: https://github.com/koide3/direct_visual_lidar_calibration
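To give a rough sense of what a pixel-level direct registration objective can look like, here is a hedged, simplified sketch (not this toolbox's actual metric or implementation): score an extrinsic hypothesis by the mutual information between projected lidar intensities and the image intensities at the projected pixels, and let the calibration search maximize that score.

```python
import numpy as np

def mutual_information(lidar_intensity, image_intensity, bins=32):
    """Mutual information between lidar intensities and the image intensities
    sampled at the pixels the lidar points project to. Higher is better:
    a well-calibrated extrinsic aligns intensity structure in both sensors."""
    joint, _, _ = np.histogram2d(lidar_intensity, image_intensity, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero]))

# Synthetic example: correlated intensities score higher than unrelated ones.
rng = np.random.default_rng(0)
lidar_i = rng.uniform(0.0, 1.0, 5000)
aligned = np.clip(lidar_i + 0.05 * rng.standard_normal(5000), 0.0, 1.0)
misaligned = rng.uniform(0.0, 1.0, 5000)
print("aligned   :", mutual_information(lidar_i, aligned))
print("misaligned:", mutual_information(lidar_i, misaligned))
```

A real calibration pipeline would evaluate such a score while optimizing over the six extrinsic parameters; the toolbox above defines its own pixel-level metric and optimizer.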