Journal Article

High-Precision Domain Adaptive Detection Method for Noncooperative Spacecraft Based on Optical Sensor Data
Document Type
Periodical
Source
IEEE Sensors Journal, 24(8):13604-13619, Apr. 2024
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Robotics and Control Systems
Space vehicles
YOLO
Feature extraction
Detectors
Lighting
Training
Task analysis
Deep learning
domain adaptation
noncooperative spacecraft
object detection
optical sensor data processing
Language
English
ISSN
1530-437X
1558-1748
2379-9153
Abstract
The accurate detection of noncooperative spacecraft based on optical sensor data is essential for critical space tasks, such as on-orbit servicing, rendezvous and docking, and debris removal. Traditional object detection methods struggle in the challenging space environment, which includes extreme variations in lighting, occlusions, and differences in image scale. To address this problem, this article proposes a high-precision, deep-learning-based, domain-adaptive detection method specifically tailored for noncooperative spacecraft. The proposed algorithm focuses on two key elements: dataset creation and network structure design. First, we develop a spacecraft image generation algorithm using a cycle generative adversarial network (CycleGAN), facilitating seamless conversion between synthetic and real spacecraft images to bridge domain differences. Second, we combine a domain-adversarial neural network with YOLOv5 to create a robust detection model based on multiscale domain adaptation. This approach enhances the YOLOv5 network’s ability to learn domain-invariant features from both synthetic and real spacecraft images. The effectiveness of our high-precision domain-adaptive detection method is verified through extensive experimentation. This method enables several novel and significant space applications, such as space rendezvous and docking, and on-orbit servicing.
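The abstract's second component, attaching a domain-adversarial neural network to YOLOv5's multiscale features, is the part a short sketch can make concrete. The record does not include the authors' code, so the following is only a minimal PyTorch-style illustration of that general idea, assuming a gradient reversal layer and one domain discriminator per feature scale; the GradientReversal and DomainClassifier modules, the loss helper, and all parameter choices are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; flips and scales gradients in the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reversed gradient pushes the shared backbone toward domain-invariant features
        return -ctx.lambd * grad_output, None


class DomainClassifier(nn.Module):
    """Per-scale domain discriminator applied to one detector feature map (assumed design)."""

    def __init__(self, in_channels: int, lambd: float = 1.0):
        super().__init__()
        self.lambd = lambd
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 256, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(256, 1),  # single logit: synthetic (0) vs. real (1) domain
        )

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        return self.net(GradientReversal.apply(feat, self.lambd))


def domain_adversarial_loss(feats, domain_label, classifiers):
    """Average binary domain loss over the detector's multiscale feature maps."""
    bce = nn.BCEWithLogitsLoss()
    loss = torch.zeros((), device=feats[0].device)
    for feat, clf in zip(feats, classifiers):
        logits = clf(feat)
        target = torch.full_like(logits, float(domain_label))
        loss = loss + bce(logits, target)
    return loss / len(feats)
```

In a training loop of this kind, the domain loss would be added to the standard detection loss for labeled synthetic images and computed on its own for unlabeled real images, so the backbone is penalized whenever the discriminators can tell the two domains apart.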