Academic Paper

Multi-Sensor Fusion Based Off-Road Drivable Region Detection and Its ROS Implementation
Document Type
Conference
Source
2023 International Conference on Wireless Communications Signal Processing and Networking (WiSPNET), pp. 1-5, Mar. 2023
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Fields, Waves and Electromagnetics
Signal Processing and Analysis
Point cloud compression
Wireless communication
Laser radar
Fuses
Navigation
Semantic segmentation
Robot vision systems
off-road driving
image
LiDAR
multi-sensor fusion
ROS
Language
English
Abstract
There is a growing demand for multi-sensor fusion based off-road drivable region detection in the field of autonomous vehicles and robotics. By combining data from multiple sensors, this technology improves navigation and localization in off-road environments such as rough terrain, leading to more accurate and reliable detection of drivable regions, which is crucial for the safe operation of autonomous vehicles off-road. In this work, a deep learning architecture is employed to identify drivable and obstacle regions in camera images; using semantic segmentation, it learns to classify and cluster the regions simultaneously. Further, a LiDAR-based ground segmentation method is introduced to classify drivable regions more effectively. The ground segmentation method splits the point cloud into small bins and applies ground fitting with adaptive likelihood estimation in each bin. Finally, a late fusion method is proposed to combine the two results for improved drivable-region classification. The entire fusion architecture was implemented on ROS. On the RELLIS3D dataset, the semantic segmentation achieves a mean accuracy of 84.3%. Furthermore, it is observed that certain regions misclassified by the semantic segmentation are corrected by the LiDAR-based ground segmentation, and the fusion provides a better representation of the drivable region.
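The abstract's bin-based LiDAR ground segmentation can be illustrated with a minimal sketch. This is not the paper's implementation: the bin layout, the plane model, and the fixed residual threshold (standing in for the paper's adaptive likelihood estimation) are all simplifying assumptions, and the function name and parameters are hypothetical.

```python
import numpy as np

def ground_segmentation(points, num_bins=10, max_range=50.0, dist_thresh=0.2):
    """Toy bin-based ground fitting (illustrative only, not the paper's method).

    points: (N, 3) array of LiDAR points (x, y, z).
    Splits points into range bins, fits a plane z = a*x + b*y + c per bin
    by least squares, and labels points close to the fitted plane as ground.
    A fixed dist_thresh replaces the paper's adaptive likelihood estimation.
    """
    ranges = np.linalg.norm(points[:, :2], axis=1)      # horizontal range per point
    labels = np.zeros(len(points), dtype=bool)          # True = ground
    edges = np.linspace(0.0, max_range, num_bins + 1)   # bin boundaries in range
    for i in range(num_bins):
        mask = (ranges >= edges[i]) & (ranges < edges[i + 1])
        bin_pts = points[mask]
        if len(bin_pts) < 3:                            # not enough points to fit a plane
            continue
        # Least-squares plane fit: z = a*x + b*y + c.
        A = np.c_[bin_pts[:, 0], bin_pts[:, 1], np.ones(len(bin_pts))]
        coef, *_ = np.linalg.lstsq(A, bin_pts[:, 2], rcond=None)
        residual = np.abs(A @ coef - bin_pts[:, 2])     # vertical distance to plane
        labels[mask] = residual < dist_thresh
    return labels
```

In the fusion stage described in the abstract, labels like these would be projected into the image frame and combined with the semantic segmentation output; that projection step is omitted here.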