Academic Paper

Using drones as reference sensors for neural-networks-based modeling of automotive perception errors
Document Type
Conference
Source
2020 IEEE Intelligent Vehicles Symposium (IV), pp. 708-715, Oct. 2020
Subject
Computing and Processing
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Sensors
Sensor systems
Roads
Drones
Laser radar
Data models
Computational modeling
Language
English
ISSN
2642-7214
Abstract
Modeling perception errors of automated vehicles requires reference data, but common reference measurement methods either cannot capture uninstructed road users or suffer from vehicle-to-vehicle occlusions. We therefore propose a method based on a camera-equipped drone hovering over the field of view of the perception system to be modeled. From recordings taken from this advantageous perspective, computer vision algorithms extract object tracks that are suited as a reference. As a proof of concept of our approach, we create and analyze a phenomenological error model of a lidar-based sensor system. From eight hours of simultaneous traffic recordings at an intersection, we extract synchronized state vectors of associated true-positive vehicle tracks. We model the deviations of the full lidar state vectors from the reference as multivariate Gaussians. The dependency of their covariance matrices and mean vectors on the reference state vector is modeled by a fully-connected neural network. By customizing the network training procedure and losses, we achieve consistent results even in sparsely populated regions of the state space. Finally, we show that time dependencies of the errors can be considered separately during sampling by means of an autoregressive model.
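
For illustration only, the sketch below (not the authors' implementation) shows the modeling idea summarized in the abstract, assuming PyTorch: a fully-connected network maps a reference state vector to the mean vector and Cholesky factor of a multivariate Gaussian error distribution, trained with the negative log-likelihood of observed lidar-minus-reference deviations, and a simple AR(1) scheme illustrates how temporally correlated error samples could be drawn afterwards. All dimensions, layer sizes, and the correlation factor `rho` are assumptions.

```python
import torch
import torch.nn as nn

STATE_DIM = 6  # assumed dimensionality of the object state vector


class GaussianErrorModel(nn.Module):
    """Fully-connected net predicting a state-dependent Gaussian error model."""

    def __init__(self, state_dim: int = STATE_DIM, hidden: int = 64):
        super().__init__()
        self.state_dim = state_dim
        self.backbone = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.mean_head = nn.Linear(hidden, state_dim)       # error mean vector
        self.log_diag_head = nn.Linear(hidden, state_dim)   # log of Cholesky diagonal
        n_off = state_dim * (state_dim - 1) // 2
        self.off_diag_head = nn.Linear(hidden, n_off)       # strictly lower triangle
        self.register_buffer(
            "tril_idx", torch.tril_indices(state_dim, state_dim, offset=-1)
        )

    def forward(self, ref_state: torch.Tensor) -> torch.distributions.MultivariateNormal:
        h = self.backbone(ref_state)
        mean = self.mean_head(h)
        # Build a lower-triangular Cholesky factor with a positive diagonal,
        # which guarantees a valid (positive-definite) covariance matrix.
        L = torch.diag_embed(torch.exp(self.log_diag_head(h))).clone()
        L[:, self.tril_idx[0], self.tril_idx[1]] = self.off_diag_head(h)
        return torch.distributions.MultivariateNormal(mean, scale_tril=L)


def nll_loss(model: GaussianErrorModel,
             ref_states: torch.Tensor,
             errors: torch.Tensor) -> torch.Tensor:
    """Negative log-likelihood of observed errors (lidar state minus reference state)."""
    return -model(ref_states).log_prob(errors).mean()


def sample_ar1_errors(dist: torch.distributions.MultivariateNormal,
                      num_steps: int, rho: float = 0.9) -> torch.Tensor:
    """AR(1)-style sampling: consecutive error samples are temporally correlated
    while each sample keeps the Gaussian marginal (rho is an assumed factor)."""
    eps = dist.sample()
    out = [eps]
    for _ in range(num_steps - 1):
        innovation = dist.sample() - dist.mean
        eps = dist.mean + rho * (eps - dist.mean) + (1.0 - rho ** 2) ** 0.5 * innovation
        out.append(eps)
    return torch.stack(out, dim=0)


if __name__ == "__main__":
    model = GaussianErrorModel()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Stand-in data: reference states (e.g. from drone tracks) and observed deviations.
    ref = torch.randn(32, STATE_DIM)
    err = 0.1 * torch.randn(32, STATE_DIM)
    loss = nll_loss(model, ref, err)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Draw a short, temporally correlated error sequence for one batch of states.
    with torch.no_grad():
        seq = sample_ar1_errors(model(ref), num_steps=10)
    print(loss.item(), seq.shape)  # seq: (10, 32, STATE_DIM)
```

Predicting the Cholesky factor rather than the covariance itself is one common way to keep the learned covariance positive definite; the paper itself only states that mean vectors and covariance matrices are predicted by a fully-connected network, so this parameterization is an assumption of the sketch.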