Academic Paper

Effect of Pruning on Catastrophic Forgetting in Growing Dual Memory Networks
Document Type
Conference
Source
2019 International Joint Conference on Neural Networks (IJCNN), pp. 1-8, Jul. 2019
Subject
Computing and Processing
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Neurons
Synapses
Biological neural networks
Training
Semantics
Knowledge engineering
Network topology
catastrophic forgetting
grow-when-required networks
network topology
Language
English
ISSN
2161-4407
Abstract
Grow-when-required networks such as the Growing Dual-Memory (GDM) networks possess a dynamic network structure, expanding to accommodate new neurons in response to learning novel concepts. Over time, it may be necessary to prune obsolete neurons and/or neural connections to meet performance or resource limitations. GDM networks utilize an age-based pruning strategy, whereby older neurons and neural connections that have not been activated recently are removed. Catastrophic forgetting occurs when knowledge learned by the networks in previous learning iterations is lost, either by being overwritten during newer learning iterations or by being removed during pruning. In this work, we investigate catastrophic forgetting in GDM networks in response to different pruning strategies. The age-based pruning method was shown to significantly sparsify the GDM network topology while improving the network's ability to recall newly acquired concepts, at the cost of a slight decrease in performance on older knowledge. A significance-based pruning method was tested as a replacement for the age-based pruning, but was not as effective at pruning even though it performed better at recalling older knowledge.
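
For illustration, the following is a minimal Python sketch of the two pruning strategies the abstract compares. The data structures and names (neurons, edges, ages, significance, max_edge_age, threshold) are assumptions made for exposition, not the authors' implementation. Age-based pruning, as described, removes connections whose age exceeds a threshold and then drops neurons left without connections; the significance-based alternative instead removes neurons whose accumulated activation falls below a threshold.

    def age_based_prune(neurons, edges, ages, max_edge_age):
        """Age-based pruning sketch (GWR/GDM style, hypothetical structures).

        neurons: dict mapping neuron id -> weight vector
        edges:   set of frozenset({i, j}) neuron-id pairs
        ages:    dict mapping edge -> integer age (assumed reset to 0
                 whenever the edge's endpoints are activated)
        """
        # Remove every connection whose age exceeds the threshold,
        # i.e. one that has not been activated recently.
        stale = {e for e in edges if ages[e] > max_edge_age}
        edges -= stale
        for e in stale:
            del ages[e]

        # A neuron with no remaining connections is considered obsolete.
        connected = {i for e in edges for i in e}
        for nid in list(neurons):
            if nid not in connected:
                del neurons[nid]
        return neurons, edges, ages

    def significance_prune(neurons, significance, threshold):
        """Significance-based alternative (hypothetical criterion).

        significance: dict mapping neuron id -> accumulated activation;
        neurons scoring below the threshold are removed.
        """
        for nid in list(neurons):
            if significance[nid] < threshold:
                del neurons[nid]
        return neurons

A small usage example under the same assumptions: with ages {(0,1): 3, (1,2): 12} and max_edge_age=10, edge (1,2) is pruned, leaving neuron 2 isolated, so it is removed as well.

    neurons = {0: [0.1, 0.2], 1: [0.4, 0.4], 2: [0.9, 0.1]}
    edges = {frozenset({0, 1}), frozenset({1, 2})}
    ages = {frozenset({0, 1}): 3, frozenset({1, 2}): 12}
    neurons, edges, ages = age_based_prune(neurons, edges, ages, max_edge_age=10)
    # neurons now contains ids 0 and 1 only; edges contains {0, 1} only.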