Academic Paper

Leveraging Temporal Information for 3D Trajectory Estimation of Space Objects
Document Type
Conference
Source
2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), pp. 3809-3815, Oct. 2021
Subject
Computing and Processing
Space vehicles
Computer vision
Three-dimensional displays
Convolution
Conferences
Neural networks
Estimation
ISSN
2473-9944
Abstract
This work presents a temporally consistent method for estimating the 3D trajectory of a space object from video captured by a single RGB camera. Understanding space objects' trajectories is an important component of Space Situational Awareness, especially for applications such as Active Debris Removal, On-orbit Servicing, and Orbital Maneuvers. Estimating 3D position from a single image alone yields temporally inconsistent results. Our approach operates in two subsequent stages. The first stage estimates the 2D location of the space object using a convolutional neural network. In the second stage, the 2D locations are lifted to 3D space using a temporal convolutional neural network that enforces temporal coherence over the estimated 3D locations. Our results show that leveraging temporal information yields smooth and accurate 3D trajectory estimates for space objects. A large, realistic synthetic dataset named SPARK-T, containing three spacecraft under various sensing conditions, is also proposed and will be publicly shared with the research community.
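The abstract does not specify the lifting network's architecture or weights; as a rough illustration of the idea only, the sketch below (in numpy, with random untrained weights and hypothetical layer sizes) shows how a small temporal convolutional network can map a per-frame sequence of 2D detections to 3D positions, so that each 3D estimate draws on a multi-frame temporal window rather than a single image.

```python
import numpy as np

def temporal_conv1d(x, w, b):
    """Valid 1D convolution over the time axis.
    x: (T, C_in), w: (k, C_in, C_out), b: (C_out,) -> (T - k + 1, C_out)."""
    T, _ = x.shape
    k, _, c_out = w.shape
    out = np.empty((T - k + 1, c_out))
    for t in range(T - k + 1):
        window = x[t:t + k]  # k consecutive frames
        out[t] = np.einsum('kc,kco->o', window, w) + b
    return out

# Hypothetical lifting stage: per-frame 2D detections (x, y) over T frames.
rng = np.random.default_rng(0)
T, k = 9, 3
dets_2d = rng.normal(size=(T, 2))        # stage-1 output (illustrative values)
w1 = rng.normal(size=(k, 2, 16)) * 0.1   # hidden temporal layer, 16 channels
b1 = np.zeros(16)
w2 = rng.normal(size=(k, 16, 3)) * 0.1   # output layer -> (X, Y, Z)
b2 = np.zeros(3)

h = np.maximum(temporal_conv1d(dets_2d, w1, b1), 0.0)  # ReLU nonlinearity
traj_3d = temporal_conv1d(h, w2, b2)
print(traj_3d.shape)  # two stacked kernel-3 layers: each 3D point sees 5 frames
```

With two stacked kernel-size-3 layers, the receptive field is five frames, so a 9-frame input yields five 3D points, each informed by its temporal neighborhood; the actual network in the paper would be trained and likely deeper.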