Academic article

The Rosenblatt Bayesian Algorithm Learning in a Nonstationary Environment
Document Type
Periodical
Source
IEEE Transactions on Neural Networks, 18(2):584-588, Mar. 2007
Subject
Computing and Processing
Communication, Networking and Broadcast Technologies
Bayesian methods
Artificial neural networks
Covariance matrix
Neural networks
Neurons
Entropy
Gaussian distribution
Gradient methods
Pattern classification
Biological neural networks
Online gradient methods
pattern classification
time-varying environment
Language
English
ISSN
1045-9227
1941-0093
Abstract
In this letter, we study online learning in neural networks (NNs) obtained by approximating Bayesian learning. The approach is applied to Gibbs learning with the Rosenblatt potential in a nonstationary environment. The online scheme is obtained by the minimization (maximization) of the Kullback–Leibler divergence (cross entropy) between the true posterior distribution and the parameterized one. The complexity of the learning algorithm is further decreased by projecting the posterior onto a Gaussian distribution and imposing a spherical covariance matrix. We study in detail the particular case of learning linearly separable rules. In the case of a fixed rule, we observe an asymptotic generalization error $e_{g}\propto\alpha^{-1}$ for both the spherical and the full covariance matrix approximations. However, in the case of a drifting rule, only the full covariance matrix algorithm shows good performance. This good performance is a surprise, since the algorithm is obtained by projection without the benefit of any extra information about the drift.
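The general scheme described in the abstract — maintain a Gaussian approximation to the posterior over weights, update it online on each example, and optionally project the covariance back onto a spherical form — can be sketched for the linearly separable case. This is a minimal illustration, not the paper's exact algorithm: it uses the standard assumed-density-filtering (moment-matching) update for a hard-threshold perceptron likelihood in place of the Rosenblatt potential, and the function names (`adf_update`, etc.) are invented for this sketch.

```python
import numpy as np
from math import erf, sqrt, pi, exp

def pdf(z):   # standard normal density
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def cdf(z):   # standard normal cumulative distribution
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def adf_update(mu, C, x, y):
    """One Gaussian-projection (moment-matching) step for the posterior
    N(mu, C) over weights, given example x with label y = ±1 and a
    hard-threshold likelihood (stand-in for the Rosenblatt potential)."""
    Cx = C @ x
    s2 = float(x @ Cx)                 # predictive variance of the local field
    s = sqrt(s2)
    z = y * float(mu @ x) / s          # normalized stability of the example
    g = pdf(z) / max(cdf(z), 1e-12)    # clamp to avoid division by zero
    mu_new = mu + (y * g / s) * Cx
    C_new = C - (g * (g + z) / s2) * np.outer(Cx, Cx)
    return mu_new, C_new

rng = np.random.default_rng(0)
d = 20
w_star = rng.standard_normal(d)        # fixed teacher rule
w_star /= np.linalg.norm(w_star)

mu = np.zeros(d); C = np.eye(d)        # full covariance approximation
mu_s = np.zeros(d); var_s = 1.0        # spherical approximation: C = var * I

for t in range(2000):
    x = rng.standard_normal(d)
    y = float(np.sign(w_star @ x))     # noiseless linearly separable labels
    mu, C = adf_update(mu, C, x, y)
    # spherical variant: update, then project covariance onto (tr C / d) * I
    mu_s, Cs = adf_update(mu_s, var_s * np.eye(d), x, y)
    var_s = float(np.trace(Cs)) / d

overlap = float(mu @ w_star) / np.linalg.norm(mu)
overlap_s = float(mu_s @ w_star) / np.linalg.norm(mu_s)
print(f"teacher overlap, full: {overlap:.3f}  spherical: {overlap_s:.3f}")
```

For a fixed rule both variants drive the overlap with the teacher toward 1 (i.e., the generalization error decays with the number of examples), consistent with the abstract's $e_{g}\propto\alpha^{-1}$ observation; handling a drifting teacher would additionally require tracking, which is where the full covariance matrix becomes essential.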