Academic Paper

Bidirectional Associative Memories: Unsupervised Hebbian Learning to Bidirectional Backpropagation
Document Type
Periodical
Author
Source
IEEE Transactions on Systems, Man, and Cybernetics: Systems, 51(1):103-115, Jan. 2021
Subject
Signal Processing and Analysis
Robotics and Control Systems
Power, Energy and Industry Applications
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
General Topics for Engineers
Backpropagation
Training
Associative memory
Neurons
Supervised learning
Backpropagation algorithms
Synapses
Bidirectional associative memory (BAM)
bidirectional backpropagation
global stability
Hebbian learning
Language
ISSN
2168-2216
2168-2232
Abstract
Bidirectional associative memories (BAMs) pass neural signals forward and backward through the same web of synapses. Earlier BAMs had no hidden neurons and did not use supervised learning. They tuned their synaptic weights with unsupervised Hebbian or competitive learning. Two-layer feedback BAMs always converge to fixed-point equilibria for threshold or threshold-like neurons. Every rectangular connection matrix is bidirectionally stable. These simpler BAMs extend to arbitrary hidden layers with supervised learning if the resulting bidirectional backpropagation algorithm uses the proper layer likelihood in the forward and backward directions. Bidirectional backpropagation lets users run deep classifiers and regressors in reverse as well as forward. Bidirectional training exploits pattern and synaptic information that forward-only running ignores.
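The abstract's claims about the classic two-layer BAM can be illustrated concretely: synaptic weights come from an unsupervised Hebbian sum of outer products over bipolar pattern pairs, and recall bounces signals forward and backward through the same weight matrix until the state settles at a bidirectional fixed point. The sketch below is illustrative only, with made-up toy patterns; the pattern arrays `X` and `Y` and the helper names are assumptions, not from the paper.

```python
import numpy as np

# Toy bipolar pattern pairs (hypothetical data for illustration).
X = np.array([[1, -1, 1, -1, 1, -1],
              [1, 1, 1, -1, -1, -1]])   # field-A patterns
Y = np.array([[1, 1, -1, -1],
              [1, -1, 1, -1]])          # field-B patterns

# Hebbian outer-product encoding: W is the sum of x_k^T y_k over all pairs.
W = X.T @ Y

def threshold(v, prev):
    # Bipolar threshold neuron: hold the previous state on a zero input.
    return np.where(v > 0, 1, np.where(v < 0, -1, prev))

def recall(x):
    # Pass signals forward (A -> B) and backward (B -> A) through the
    # same synaptic web W until the two-layer loop reaches a fixed point.
    y = np.ones(W.shape[1], dtype=int)
    while True:
        y_new = threshold(W.T @ x, y)    # forward pass
        x_new = threshold(W @ y_new, x)  # backward pass
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            return x_new, y_new          # bidirectional equilibrium
        x, y = x_new, y_new
```

With these two (mutually orthogonal in field B) pairs, feeding either stored A-pattern into `recall` converges in a couple of sweeps to its paired B-pattern, consistent with the abstract's point that such two-layer feedback BAMs converge to fixed-point equilibria for threshold neurons.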