Academic Paper

Single Frame Lidar-Camera Calibration Using Registration of 3D Planes
Document Type
Conference
Source
2022 Sixth IEEE International Conference on Robotic Computing (IRC), pp. 395-402, Dec. 2022
Subject
Computing and Processing
Robotics and Control Systems
Point cloud compression
Three-dimensional displays
Laser radar
Transmission line matrix methods
Robot vision systems
Cameras
Real-time systems
lidar and camera calibration
correntropy
iterative closest point
sensor calibration
Language
Abstract
This work focuses on finding the extrinsic parameters (rotation and translation) between a Lidar and an RGB camera sensor. We place a planar checkerboard inside the Field-of-View (FOV) of both sensors and extract the 3D plane of the checkerboard from each sensor's data. The plane coefficients extracted from the sensors' data are used to construct a well-structured set of 3D points. These 3D points are then 'aligned', which yields the relative transformation between the two sensors. We use our proposed Correntropy Similarity Matrix Iterative Closest Point (CoSMICP) algorithm to estimate the relative transformation. This work uses a single frame of point cloud data acquired from the Lidar sensor and a single frame from the calibrated camera to perform this operation. From the camera image, we use the projection of the calibration target's corner points to compute the 3D points, and along the way we calculate the 3D plane equation from those corner points. We evaluate our approach on a simulated dataset with complex environment settings, exploiting the freedom to assess multiple configurations. The obtained results verify our method under various configurations.
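The abstract outlines two geometric building blocks: fitting a 3D plane to checkerboard points from each sensor, and aligning the resulting structured point sets to recover the Lidar-camera transformation. The sketch below illustrates both steps under stated assumptions; since the abstract does not detail CoSMICP itself, the alignment step is shown with the standard SVD-based (Kabsch) rigid registration that ICP variants build on, not the authors' correntropy-weighted method. Function names (`fit_plane`, `align_point_sets`) are illustrative, not from the paper.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit via SVD; returns (n, d) with n . p + d = 0.

    The plane normal is the direction of least variance of the
    mean-centered points (last right-singular vector).
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    d = -normal @ centroid
    return normal, d

def align_point_sets(src, dst):
    """Rigid transform (R, t) mapping src onto dst (Kabsch algorithm).

    Assumes src and dst are (N, 3) arrays with known one-to-one
    correspondences, as with the structured points built from the
    plane coefficients in the abstract.
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)           # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if det < 0
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

With correspondences known, this closed-form alignment is exact up to noise; an ICP-style method such as CoSMICP additionally re-estimates correspondences iteratively and reweights them (here, by a correntropy similarity measure) to gain robustness to outliers.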