Academic Paper

LiDAR-Stereo Thermal Sensor Fusion for Indoor Disaster Environment
Document Type
Periodical
Source
IEEE Sensors Journal, 23(7):7816-7827, Apr. 2023
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Robotics and Control Systems
Point cloud compression
Laser radar
Thermal sensors
Cameras
Sensors
Robot sensing systems
Robots
Indoor low-visibility disaster environment
light detection and ranging (LiDAR)
point cloud generation
sensor fusion
stereo thermal infrared cameras
Language
English
ISSN
1530-437X
1558-1748
2379-9153
Abstract
This article proposes a method of point cloud generation for indoor low-visibility disaster environments. Recently, robots have been developed to perform various missions in such environments, which are potentially harmful to humans. However, an indoor disaster environment is often filled with dense fog, which makes robot navigation challenging because widely used sensors [e.g., optical cameras and light detection and ranging (LiDAR)] cannot operate reliably under low visibility. Several methods have been proposed to address this problem. In this article, we propose a sensor-fusion method that can generate point clouds of uneven, foggy indoor environments using LiDAR and stereo thermal infrared cameras. We generate point clouds using stereo depth estimation and resample them to match the angular resolution of the LiDAR point cloud. We then refine them based on thermal edge information and finally integrate them with the LiDAR point cloud after removing fog-induced returns. Furthermore, we performed an indoor experiment, and the results showed that the proposed method generates point clouds usable with conventional LiDAR odometry and mapping algorithms.
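The pipeline outlined in the abstract (stereo depth to point cloud, resampling to the LiDAR's angular resolution, and fusion with a fog-filtered LiDAR scan) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the camera intrinsics, the angular-grid binning, and the fog-rejection threshold (dropping near-range returns as presumed fog backscatter) are all assumptions made for the sketch.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) from a stereo thermal pair
    into an N x 3 point cloud using pinhole intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def downsample_to_angular_grid(points, az_bins, el_bins):
    """Keep one (nearest) point per azimuth/elevation cell so the dense
    stereo cloud matches a LiDAR-like angular resolution."""
    az = np.arctan2(points[:, 0], points[:, 2])
    el = np.arctan2(points[:, 1], np.hypot(points[:, 0], points[:, 2]))
    ai = np.clip(((az + np.pi / 2) / np.pi * az_bins).astype(int), 0, az_bins - 1)
    ei = np.clip(((el + np.pi / 2) / np.pi * el_bins).astype(int), 0, el_bins - 1)
    cell = ai * el_bins + ei
    order = np.argsort(points[:, 2])          # nearest points first
    _, first = np.unique(cell[order], return_index=True)
    return points[order][first]

def fuse(lidar_points, lidar_range, thermal_points, fog_range=1.5):
    """Drop LiDAR returns closer than fog_range (treated here as fog
    backscatter -- an illustrative heuristic) and merge the two clouds."""
    keep = lidar_range > fog_range
    return np.vstack([lidar_points[keep], thermal_points])
```

In a real system the fog threshold would come from the LiDAR's return intensity or a learned classifier, and the thermal-edge refinement step described in the abstract would prune the stereo cloud before fusion.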