Academic Paper

Visible Light-Based Position and Pose Tracking: A Bidirectional Recurrent Convolutional Learning Method to Address Environment Dynamics
Document Type
Conference
Source
2022 IEEE/CIC International Conference on Communications in China (ICCC), pp. 559-564, Aug. 2022
Subject
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Fields, Waves and Electromagnetics
Signal Processing and Analysis
Performance evaluation
Optical filters
Three-dimensional displays
Convolution
Heuristic algorithms
Scattering
Receivers
Visible Light-Based Positioning
Spatial-Time Texture Feature
Sample Image
Recurrent Neural Network
Language
English
Abstract
In this paper, we are interested in visible light-based position and orientation tracking (VLP) for mobile user devices (UDs) in dynamic environments. Conventional model-based VLP usually depends on an ideal signal propagation model (SPM) with fixed parameters, so its performance degrades severely when the environment varies over time, e.g., due to diffuse scattering or fluctuations in the receiver's optical filter gain. To address this challenge, we propose a bidirectional recurrent convolutional neural network (Bi-RCNN)-based VLP algorithm. The Bi-RCNN extracts time-domain correlation features from the measurement sample series while spatial-domain texture features are simultaneously captured via 3D convolution-driven memory networks. In this way, spatial-temporal texture features are fully exploited, improving mobile UD tracking performance. Numerical experiments validate that our Bi-RCNN-based VLP outperforms existing VLP baselines, achieving a localization error of around 1.5 cm in dynamic environments.
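The pipeline the abstract describes (3D convolution over a time series of received-signal "sample images" to extract spatial-time texture, a bidirectional recurrent pass over the per-timestep features, and a regression head for position) can be sketched in miniature as follows. This is an illustrative NumPy toy with random weights and made-up dimensions, not the paper's actual Bi-RCNN; the 8x8 frame size, kernel size, and hidden width are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv3d_valid(x, k):
    """Naive valid 3D convolution: x is (T, H, W), k is (t, h, w)."""
    t, h, w = k.shape
    T, H, W = x.shape
    out = np.empty((T - t + 1, H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for l in range(out.shape[2]):
                out[i, j, l] = np.sum(x[i:i+t, j:j+h, l:l+w] * k)
    return out

def bidirectional_rnn(seq, Wx, Wh, b):
    """Run a simple tanh RNN forward and backward over seq (T, d),
    then concatenate the two hidden-state sequences."""
    def run(s):
        h = np.zeros(Wh.shape[0])
        states = []
        for x in s:
            h = np.tanh(Wx @ x + Wh @ h + b)
            states.append(h)
        return np.stack(states)
    fwd = run(seq)
    bwd = run(seq[::-1])[::-1]   # backward pass, re-aligned in time
    return np.concatenate([fwd, bwd], axis=1)

# Illustrative input: 8 sample images from an 8x8 photodetector grid.
T, H, W = 8, 8, 8
frames = rng.standard_normal((T, H, W))

kernel = rng.standard_normal((3, 3, 3)) * 0.1  # 3D kernel spans space and time
feat = conv3d_valid(frames, kernel)            # (6, 6, 6)
seq = feat.reshape(feat.shape[0], -1)          # per-timestep feature vectors (6, 36)

d, hdim = seq.shape[1], 16
Wx = rng.standard_normal((hdim, d)) * 0.1
Wh = rng.standard_normal((hdim, hdim)) * 0.1
b = np.zeros(hdim)
hseq = bidirectional_rnn(seq, Wx, Wh, b)       # (6, 32)

Wout = rng.standard_normal((3, 2 * hdim)) * 0.1
pos = Wout @ hseq[-1]                          # regressed 3D position estimate
print(pos.shape)                               # (3,)
```

In a trained model the convolution kernel, recurrent weights, and output head would be learned end-to-end from labeled position/pose trajectories; here they are random and the snippet only demonstrates the data flow.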