Academic Journal

Initialization-Based k-Winners-Take-All Neural Network Model Using Modified Gradient Descent
Document Type
Periodical
Source
IEEE Transactions on Neural Networks and Learning Systems, 34(8):4130-4138, Aug. 2023
Subject
Computing and Processing
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
General Topics for Engineers
Mathematical models
Biological neural networks
Neurons
Integrated circuit modeling
Convergence
Computational modeling
Hardware
Keywords
Constraint conversion
gradient descent
k-winners-take-all (k-WTA)
optimization
Language
English
ISSN
2162-237X (print)
2162-2388 (electronic)
Abstract
The $k$-winners-take-all ($k$-WTA) problem refers to selecting the $k$ winners with the $k$ largest inputs from a group of $n$ neurons, where each neuron has an input. In existing $k$-WTA neural network models, the positive integer $k$ is explicitly given in the corresponding mathematical models. In this article, we consider another case, in which the number $k$ in the $k$-WTA problem is implicitly specified by the initial states of the neurons. Based on a constraint conversion for a classical optimization formulation of the $k$-WTA problem, and by modifying the traditional gradient descent, we propose an initialization-based $k$-WTA neural network model with only $n$ neurons for $n$-dimensional inputs, whose dynamics are described by a parameterized gradient descent. Theoretical results show that the state vector of the proposed $k$-WTA neural network model globally asymptotically converges to the theoretical $k$-WTA solution under mild conditions. Simulation examples demonstrate the effectiveness of the proposed model and indicate that its convergence can be accelerated by tuning two design parameters.
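To make the target behavior concrete, below is a minimal Python sketch of the ideal $k$-WTA mapping, i.e., the theoretical solution to which the proposed model's state vector is shown to converge. The function name kwta and the example inputs are illustrative assumptions; the paper's parameterized gradient-descent dynamics are not reproduced here.

import numpy as np

def kwta(u, k):
    # Ideal k-WTA mapping: output 1 for the k largest inputs, 0 otherwise.
    # This is the theoretical k-WTA solution; the network's parameterized
    # gradient-descent dynamics from the paper are not implemented here.
    x = np.zeros_like(u, dtype=float)
    x[np.argsort(u)[-k:]] = 1.0  # indices of the k largest entries win
    return x

u = np.array([0.3, 1.7, -0.5, 2.2, 0.9])
print(kwta(u, k=2))  # [0. 1. 0. 1. 0.] -- winners are the inputs 1.7 and 2.2

In the initialization-based setting described in the abstract, $k$ would not be passed explicitly as above but would instead be encoded in the neurons' initial states.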