Journal Article

Activation Functions for Convolutional Neural Networks: Proposals and Experimental Study
Document Type
Periodical
Source
IEEE Transactions on Neural Networks and Learning Systems, 34(3):1478-1488, Mar. 2023
Subject
Computing and Processing
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
General Topics for Engineers
Feature extraction
Training
Neurons
Proposals
Neural networks
Dispersion
Convolution
Activation functions
convolutional networks
ELUs+2
exponential linear unit (ELU)
Language
English
ISSN
2162-237X (Print)
2162-2388 (Electronic)
Abstract
Activation functions lie at the core of every neural network model, from shallow networks to deep convolutional neural networks. Their properties and characteristics shape the output range of each layer and, thus, the capabilities of the network. Modern approaches mostly rely on a single function choice for the whole network, usually ReLU or a similar alternative. In this work, we propose two new activation functions, analyze their properties, and compare them with 17 different functions proposed in recent literature on six distinct problems with different characteristics. The objective is to shed some light on their comparative performance. The results show that the proposed functions achieved better performance than the most commonly used ones.
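
As background for the abstract above, the two standard activation functions it references, ReLU and the exponential linear unit (ELU), can be sketched in Python as below. This is a minimal sketch of the well-known textbook definitions only; the paper's own proposed functions (listed in the keywords as ELUs+2) are not defined in this record and are not reproduced here.

import numpy as np

def relu(x):
    # Rectified linear unit: identity for positive inputs, zero otherwise.
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # Exponential linear unit: identity for positive inputs,
    # alpha * (exp(x) - 1) otherwise (smooth, bounded below by -alpha).
    return np.where(x > 0, x, alpha * np.expm1(x))

# Example: compare the two on a few sample inputs.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))  # [0.  0.  0.  0.5 2. ]
print(elu(x))   # negative inputs map to alpha*(exp(x)-1) instead of 0

Unlike ReLU, ELU produces nonzero (negative) outputs for negative inputs, which is one reason ELU-style variants are studied as alternatives in work such as this paper.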