Academic Article

Dense Broad Learning System based on Conjugate Gradient
Document Type
Conference
Source
2020 International Joint Conference on Neural Networks (IJCNN), pp. 1-6, Jul. 2020
Subject
Bioengineering
Computing and Processing
General Topics for Engineers
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Learning systems
Gradient methods
Machine learning
Least mean squares methods
Neural networks
Training data
Training
broad learning system
conjugate gradient
neural networks with random weights
random vector functional link neural network
Language
English
ISSN
2161-4407
Abstract
The conventional training mechanism for deep learning, which is based on gradient descent, suffers from several notorious issues, such as slow convergence, over-fitting, and high time consumption. To alleviate these problems, a deep learning algorithm with a different learning mechanism, named the Broad Learning System (BLS), was proposed by Prof. C. L. Philip Chen in 2017. BLS randomly selects the parameters of its feature nodes and enhancement nodes during training and uses ridge regression theory to solve for its output weights. Because of its high efficiency, BLS has been widely applied in many fields. However, a fundamental problem remains unsolved: the appropriate value of the parameter λ for the ridge regression operation of BLS is difficult to set properly, which often leads to over-fitting and seriously limits the development of BLS. To solve this problem, this paper proposes a novel Dense BLS based on Conjugate Gradient (CG-DBLS), in which each feature node is connected to the other feature nodes, and each enhancement node to the other enhancement nodes, in a feed-forward fashion. The recursive least squares method and the conjugate gradient method are used to calculate the output weights of the feature nodes and enhancement nodes, respectively. Experimental studies on four benchmark regression problems from the UCI repository show that CG-DBLS achieves much lower error and much higher stability than BLS and its variants.
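The abstract gives no implementation details, but the contrast it draws can be illustrated with a minimal sketch: the standard BLS output-weight solve via ridge regression (with the λ the paper criticises) versus a λ-free conjugate-gradient solve of the normal equations, in the spirit of CG-DBLS. All function names, the matrix A of feature/enhancement node outputs, and the NumPy formulation below are assumptions for illustration, not the authors' code; the recursive least squares step and the dense node-to-node connections of CG-DBLS are not modelled here.

```python
import numpy as np

def ridge_output_weights(A, Y, lam=1e-3):
    """Standard BLS route: W = (A^T A + lam*I)^{-1} A^T Y.
    lam is the regularisation parameter the paper argues is hard to set."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ Y)

def cg_output_weights(A, y, iters=100, tol=1e-10):
    """Lambda-free alternative: solve the normal equations
    (A^T A) w = A^T y for one target column with plain conjugate gradient."""
    M, b = A.T @ A, A.T @ y
    w = np.zeros_like(b)
    r = b - M @ w              # initial residual
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(iters):
        Mp = M @ p
        alpha = rs / (p @ Mp)  # step length along p
        w += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:   # residual small enough: converged
            break
        p = r + (rs_new / rs) * p   # new conjugate direction
        rs = rs_new
    return w
```

For multi-output regression, cg_output_weights would be applied once per target column. The point of the contrast is that CG iterates on the normal equations directly and can be stopped by a residual tolerance, whereas the ridge solve bakes the sensitivity to λ into a one-shot matrix inversion.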