Academic Article

Ensemble Optimization for Invasive Ductal Carcinoma (IDC) Classification Using Differential Cartesian Genetic Programming
Document Type
Periodical
Source
IEEE Access, vol. 10, pp. 128790-128799, 2022
Subject
Aerospace
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Engineering Profession
Fields, Waves and Electromagnetics
General Topics for Engineers
Geoscience
Nuclear Engineering
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Training data
Optimization
Genetic programming
Classification algorithms
Image classification
Feature extraction
Histopathology
Cartesian genetic programming
hyperparameter optimization
ensemble
CNNs
histopathological image classification
Language
English
ISSN
2169-3536
Abstract
The high cost of acquiring annotated histological slides of breast specimens motivates exploiting an ensemble of models appropriately trained on small datasets. Histological image classification ensembles strive to accurately detect abnormal tissue in breast samples by determining the correlation between the predictions of their weak learners. Nonetheless, state-of-the-art ensemble methods, such as boosting and bagging, rely merely on manipulating the dataset and lack intelligent ensemble decision making. Furthermore, these methods fall short on the diversity of the ensemble's weak models. Likewise, other commonly used voting strategies, such as weighted averaging, are limited in how they balance the classifiers' diversity and accuracy. Hence, in this paper, we assemble a neural network ensemble that integrates models trained on small datasets by employing biologically inspired methods. Our procedure comprises two stages. First, we train multiple heterogeneous pre-trained models on the benchmark Breast Histopathology Images dataset for Invasive Ductal Carcinoma (IDC) classification. In the second, meta-training phase, we use differential Cartesian Genetic Programming (dCGP) to generate a neural network that merges the trained models optimally. We compared our empirical outcomes with other state-of-the-art techniques. Our results demonstrate that constructing a neural network ensemble using Cartesian Genetic Programming outperforms previously published algorithms on small datasets.
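The two-stage idea in the abstract — first train several weak learners, then let an evolutionary method learn how to combine their predictions instead of using a fixed voting rule — can be illustrated with a minimal sketch. This is not the paper's dCGP pipeline: it stands in a simple (1+4) evolution strategy over combination weights, with synthetic per-model IDC probabilities replacing the real CNN outputs; all names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for stage 1: per-model IDC probabilities on a validation set.
# Shapes: (n_models, n_samples); labels are 0 (benign) / 1 (IDC).
n_models, n_samples = 3, 200
y_true = rng.integers(0, 2, n_samples)
# Each synthetic "weak learner" is noisy but better than chance.
preds = np.clip(y_true + rng.normal(0, 0.45, (n_models, n_samples)), 0, 1)

def accuracy(weights):
    """Validation accuracy of the weighted-average ensemble under `weights`."""
    w = np.abs(weights) / (np.abs(weights).sum() + 1e-12)
    fused = w @ preds                      # (n_samples,) fused probabilities
    return np.mean((fused > 0.5) == y_true)

# Stage 2 stand-in: a (1+4) evolution strategy searches for combination
# weights, playing the role of the meta-training phase in the abstract.
best_w = np.ones(n_models)                 # start from equal weighting
best_acc = accuracy(best_w)
for _ in range(200):
    children = best_w + rng.normal(0, 0.2, (4, n_models))
    accs = [accuracy(c) for c in children]
    i = int(np.argmax(accs))
    if accs[i] >= best_acc:                # elitist acceptance: never worsen
        best_w, best_acc = children[i], accs[i]

print(f"evolved ensemble accuracy: {best_acc:.3f}")
print(f"best single model:         "
      f"{max(np.mean((p > 0.5) == y_true) for p in preds):.3f}")
```

The elitist acceptance rule guarantees the evolved ensemble is never worse on the validation set than the equal-weight average it starts from; the actual paper evolves a full combining network with dCGP rather than a single weight vector.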