Academic Article

Fast Modifications of the SpikeProp Algorithm
Document Type
Conference
Source
The 2006 IEEE International Joint Conference on Neural Networks (IJCNN '06), pp. 3970-3977, 2006
Subject
Computing and Processing
Components, Circuits, Devices and Systems
Signal Processing and Analysis
Neurons
Fires
Equations
Biological system modeling
Signal processing
Delay effects
Mathematical model
USA Councils
Electronic mail
Neural networks
Language
English
ISSN
Print ISSN: 2161-4393
Electronic ISSN: 2161-4407
Abstract
In this paper we develop and analyze Spiking Neural Network (SNN) versions of Resilient Propagation (RProp) and QuickProp, two training methods that speed up learning in Artificial Neural Networks (ANNs) by making certain assumptions about the data and the error surface. Modifications are made to both algorithms to adapt them to SNNs. On the standard XOR and Fisher Iris data sets, the QuickProp and RProp versions of SpikeProp are shown to converge to a final error of 0.5 on average 80% faster than SpikeProp on its own.
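
Note: the abstract only names the update rules; the exact modifications for spike-time gradients are given in the paper itself. As a rough illustration of the RProp component, the following Python/NumPy sketch shows the standard sign-based, per-weight step-size adaptation that RProp applies. The function name rprop_update, the constants, and the flat weight/gradient-vector interface are illustrative assumptions, not the authors' code; in the SpikeProp setting the gradient would be dE/dw computed from the difference between actual and desired firing times.

import numpy as np

# Illustrative RProp-style update for a vector of synaptic weights.
# SpikeProp would supply grad = dE/dw, derived from the spike-time
# error; here grad is simply an input to the update rule.

ETA_PLUS, ETA_MINUS = 1.2, 0.5      # step-size growth / shrink factors
DELTA_MIN, DELTA_MAX = 1e-6, 50.0   # bounds on the per-weight step size

def rprop_update(w, grad, prev_grad, delta):
    """One RProp step: adapt each weight's step size from the signs of
    the current and previous gradients, then move against the gradient."""
    sign_change = grad * prev_grad
    # Gradient kept its sign: grow the step size (up to DELTA_MAX).
    delta = np.where(sign_change > 0,
                     np.minimum(delta * ETA_PLUS, DELTA_MAX), delta)
    # Gradient flipped sign: a minimum was overshot, shrink the step size.
    delta = np.where(sign_change < 0,
                     np.maximum(delta * ETA_MINUS, DELTA_MIN), delta)
    # iRprop- style: where the sign flipped, zero the gradient so no step
    # is taken for that weight this iteration and the sign test is
    # skipped on the next one.
    grad = np.where(sign_change < 0, 0.0, grad)
    # Step against the gradient using only its sign and the adapted step.
    w = w - np.sign(grad) * delta
    return w, grad, delta

In a SpikeProp network, where each connection is made up of several delayed synaptic terminals, the step sizes delta (and the previous gradients) would presumably be maintained per terminal and carried across epochs, with the learning rule otherwise unchanged.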