Academic Paper

Exploring the Properties and Evolution of Neural Network Eigenspaces during Training
Document Type
Conference
Source
2022 12th Iranian / Second International Conference on Machine Vision and Image Processing (MVIP), pp. 1-6, Feb. 2022
Subject
Computing and Processing
Training
Measurement
Visualization
Pathology
Machine vision
Neural networks
Complexity theory
convolutional neural networks
logistic regression probes
saturation
Language
ISSN
2166-6784
Abstract
We investigate properties and the evolution of the emergent inference process inside neural networks using layer saturation [1] and logistic regression probes [2]. We demonstrate that the difficulty of a problem, defined by the number of classes and the complexity of the visual domain, and the number of parameters in neural network layers affect predictive performance in an antagonistic manner. We further show that this relationship can be measured using saturation, which opens the possibility of detecting over- and under-parameterization of neural networks. We also show that the observed effects are independent of previously reported pathological patterns such as the "tail pattern" described in [1]. Finally, we study the emergence of saturation patterns during training, showing that they emerge early. This allows for early analysis and potentially shorter experiment cycle times.
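As context for the saturation metric referenced above, a minimal sketch of how layer saturation can be computed, assuming (as in [1]) that saturation is the fraction of eigendirections of a layer's activation covariance matrix needed to explain a fixed share (here 99%) of the variance; the function name and threshold are illustrative, not taken from the paper's code:

```python
import numpy as np

def layer_saturation(activations, delta=0.99):
    """Fraction of eigendirections of the activation covariance
    needed to explain a `delta` share of the total variance.

    activations: (n_samples, n_features) array of one layer's outputs.
    Returns a value in (0, 1]; low values suggest over-parameterization.
    """
    cov = np.cov(activations, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)[::-1]          # descending order
    explained = np.cumsum(eigvals) / np.sum(eigvals)  # cumulative variance ratio
    k = np.searchsorted(explained, delta) + 1         # smallest k reaching delta
    return k / activations.shape[1]

rng = np.random.default_rng(0)
# Low-rank activations: variance concentrated in 4 directions -> low saturation.
low_rank = rng.normal(size=(1000, 4)) @ rng.normal(size=(4, 64))
# Isotropic activations: variance spread across all 64 directions -> high saturation.
isotropic = rng.normal(size=(1000, 64))
print(layer_saturation(low_rank), layer_saturation(isotropic))
```

On synthetic data like this, the low-rank case yields a saturation near 4/64 while the isotropic case approaches 1, illustrating how the metric could separate under-utilized from fully utilized layers.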