
e-Article

How the Brain Achieves Real-Time Vision: A Spiking Position Perception Model
Document Type
Periodical
Source
IEEE Transactions on Cognitive and Developmental Systems, 16(3):961-972, Jun. 2024
Subject
Computing and Processing
Signal Processing and Analysis
Neurons
Delays
Visualization
Voltage control
Real-time systems
Neuromorphics
Robots
Dynamic vision sensor (DVS)
neural delays
neuromorphic chip
position perception
spiking neural networks (SNNs)
Language
English
ISSN
2379-8920
2379-8939
Abstract
Real-time visual perception is essential for animals to survive in complex natural environments and for robots to interact with moving targets. However, delays inevitably arise during signal transmission and processing in both animals and robots, and these delays produce errors during real-time interaction with the physical world. Observations in nature show that animals can compensate for these pervasive delays remarkably well. In this article, we propose a novel and effective position perception model (PPM) based on spiking neural networks (SNNs) to address this problem in robotic vision systems. We evaluate PPM by tracking a moving target and show that it compensates for the system's temporal delays regardless of the target's speed. We also present a deep version of PPM (dPPM), which handles more complex situations and makes long-term anticipations. Finally, we implement PPM on neuromorphic chips and test it on real dynamic vision sensor (DVS) data, where it achieves real-time or anticipatory visual perception.
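The abstract's core idea — compensating a known sensing/processing delay so that a tracker perceives a moving target's *current* position rather than its stale, delayed one — can be illustrated with a minimal non-spiking sketch. This is not the authors' PPM (which uses spiking neural networks); the `compensate` function below is a hypothetical name, and it simply extrapolates the target's recent motion forward by the delay, which is exact for a constant-velocity target.

```python
def compensate(observations, delay):
    """Extrapolate the latest delayed observation forward by `delay` steps.

    observations: list of (t, x) samples; each x is the target position
    observed `delay` time steps late. Uses the last two samples to
    estimate velocity, then projects the position to the present.
    """
    (t0, x0), (t1, x1) = observations[-2], observations[-1]
    velocity = (x1 - x0) / (t1 - t0)
    return x1 + velocity * delay

# Toy example: a target moving at 2 units/step, observed with a 3-step delay.
def true_pos(t):
    return 2.0 * t

delay = 3
obs = [(t, true_pos(t - delay)) for t in range(4, 10)]

# Without compensation the tracker lags by velocity * delay = 6 units;
# with compensation it recovers the true current position.
print(compensate(obs, delay))  # 18.0 == true_pos(9)
```

A spiking implementation such as PPM achieves an analogous effect with neuron dynamics and event-driven DVS input rather than explicit velocity arithmetic, which is what makes it deployable on neuromorphic chips.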