Academic Paper

LiDAR and Camera Raw Data Sensor Fusion in Real-Time for Obstacle Detection
Document Type
Conference
Source
2023 IEEE Sensors Applications Symposium (SAS), pp. 1-6, Jul. 2023
Subject
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Power, Energy and Industry Applications
Robotics and Control Systems
Point cloud compression
Laser radar
Object detection
Sensor fusion
Cameras
Real-time systems
Sensors
LiDAR
Calibration
Sensor Fusion
Perception
Autonomous Vehicle
HD Map
Language
Abstract
Light Detection and Ranging (LiDAR) and camera are the most widely used sensors in autonomous vehicles for object detection, classification, localization, and mapping. This paper proposes the fusion of raw data from LiDAR and camera sensors in real time. Multiple sample datasets are collected to estimate the intrinsic and extrinsic calibration parameters so that fusion works on real-time data with minimal projection error. Most prior work on LiDAR-camera data fusion does not project point clouds onto images in real time. The colored point cloud obtained by back projection can be used to construct a High-Definition (HD) map. Relying on a single sensor is unwise, as any one sensor is prone to perception errors. In this paper, the LiDAR point cloud is projected onto the image, and back projection assigns color information from the image to the point cloud, yielding a colored point cloud. The LiDAR and camera data are fused in real time to perform classification using the camera and depth estimation using LiDAR. The raw data from both sensors are projected at a rate much higher than the data acquisition rate, so that data processing and the perception algorithms that detect obstacles run in real time on the autonomous vehicle.
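The projection and back-projection steps the abstract describes can be sketched as follows. This is a minimal NumPy sketch assuming a standard pinhole camera model with intrinsic matrix K and a rigid LiDAR-to-camera extrinsic (R, t); the function names and interfaces are illustrative, not taken from the paper:

```python
import numpy as np

def project_points(points_lidar, K, R, t, image_shape):
    """Project 3-D LiDAR points into the image plane.

    points_lidar : (N, 3) points in the LiDAR frame.
    K            : (3, 3) camera intrinsic matrix.
    R, t         : extrinsic rotation (3, 3) and translation (3,)
                   from the LiDAR frame to the camera frame.
    Returns pixel coordinates (M, 2) and the indices of the input
    points that land inside the image.
    """
    # Transform points into the camera frame: X_cam = R X_lidar + t.
    points_cam = points_lidar @ R.T + t
    # Keep only points in front of the camera (positive depth).
    in_front = points_cam[:, 2] > 0
    points_cam = points_cam[in_front]
    # Perspective projection with the intrinsic matrix.
    pixels_h = points_cam @ K.T
    pixels = pixels_h[:, :2] / pixels_h[:, 2:3]
    # Discard projections that fall outside the image bounds.
    h, w = image_shape
    valid = ((pixels[:, 0] >= 0) & (pixels[:, 0] < w) &
             (pixels[:, 1] >= 0) & (pixels[:, 1] < h))
    indices = np.flatnonzero(in_front)[valid]
    return pixels[valid], indices

def colorize_point_cloud(points_lidar, image, K, R, t):
    """Back projection: sample image colors at the projected pixels
    to attach (R, G, B) to each visible LiDAR point."""
    pixels, idx = project_points(points_lidar, K, R, t, image.shape[:2])
    uv = pixels.astype(int)
    colors = image[uv[:, 1], uv[:, 0]]  # image is indexed [row=v, col=u]
    # Return an (M, 6) array: x, y, z, r, g, b.
    return np.hstack([points_lidar[idx], colors.astype(float)])
```

In a real-time pipeline these two functions would be applied to each incoming LiDAR sweep and the temporally closest camera frame; the calibration (K, R, t) is computed offline from the sample data mentioned in the abstract and held fixed at run time.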