Academic paper

Exploiting Event Cameras for Spatio-Temporal Prediction of Fast-Changing Trajectories
Document Type
Conference
Source
2020 2nd IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), pp. 108-112, Aug. 2020
Subject
Bioengineering
Components, Circuits, Devices and Systems
Computing and Processing
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Cameras
Trajectory
Robot vision systems
Target tracking
Visualization
Language
English
Abstract
This paper investigates trajectory prediction for robotics, aiming to improve the interaction of robots with moving targets, such as catching a bouncing ball. Unexpected, highly non-linear trajectories cannot easily be predicted with regression-based fitting procedures; we therefore apply state-of-the-art machine learning, specifically Long Short-Term Memory (LSTM) architectures. In addition, fast-moving targets are better sensed with event cameras, which produce an asynchronous output triggered by spatial change, rather than at fixed temporal intervals as with traditional cameras. We investigate how LSTM models can be adapted to event camera data and, in particular, examine the benefit of using asynchronously sampled data.
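One way the adaptation described in the abstract can be sketched: because event-camera samples arrive at irregular times, the inter-event interval Δt can be appended as an extra input feature to a standard LSTM cell instead of assuming a fixed sampling period. The toy event stream, dimensions, untrained weights, and Δt-as-feature scheme below are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical asynchronous event stream: irregular timestamps and a
# toy 2-D trajectory, mimicking (t, x, y) samples from an event camera.
ts = np.cumsum(rng.exponential(0.01, size=20))          # irregular times
xy = np.cumsum(rng.normal(0, 0.1, size=(20, 2)), axis=0)  # toy positions

H, D = 8, 3  # hidden size; input = (x, y, dt)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell update using the standard gate formulation."""
    z = W @ x + U @ h + b           # stacked pre-activations, shape (4H,)
    i, f, g, o = np.split(z, 4)     # input, forget, candidate, output
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

# Randomly initialized (untrained) parameters, for illustration only.
W = rng.normal(0, 0.1, (4 * H, D))
U = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)

# Key adaptation for event data: feed dt as a feature rather than
# resampling the stream onto a fixed temporal grid.
h, c = np.zeros(H), np.zeros(H)
prev_t = ts[0]
for t, (px, py) in zip(ts, xy):
    dt = t - prev_t
    h, c = lstm_step(np.array([px, py, dt]), h, c, W, U, b)
    prev_t = t

# A linear readout would map the final hidden state to a predicted
# next (x, y); here it is untrained, so only shapes are meaningful.
W_out = rng.normal(0, 0.1, (2, H))
pred = W_out @ h
print(pred.shape)  # (2,)
```

With trained weights, the same loop would run online as events arrive, which is where the asynchronous formulation pays off against fixed-rate sampling.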