Academic Paper

Multi-objective Filter-based Feature Selection Using NSGAIII With Mutual Information and Entropy
Document Type
Conference
Source
2020 2nd International Conference on Computer and Information Sciences (ICCIS), pp. 1-7, Oct. 2020
Subject
Computing and Processing
Robotics and Control Systems
Signal Processing and Analysis
Entropy
Error analysis
Filtering algorithms
Redundancy
Optimization
Information filters
Mutual information
Feature Selection
Multi-Objective Optimization
NSGAII
NSGAIII
Filter-Based
Mutual Information
Language
English
Abstract
Feature selection (FS) aims to select the subset of the most informative features while discarding redundant ones, thereby improving classification performance. Hence, it can be considered a two-objective optimisation problem. However, most existing work treats FS as single-objective by combining the two aims into a single fitness function. As such, there is a trade-off between the number of selected features and classification performance. To balance the conflicting aims of FS while still improving classification performance, this study proposes the use of the nondominated sorting genetic algorithm NSGAIII. Filter-based FS methods are scalable to high-dimensional datasets and computationally fast; however, their classification performance is low because they do not account for feature interaction among the selected subset of features. On that basis, mutual information (MI) and entropy are proposed as filter-based evaluation measures combined with NSGAIII, yielding NSGAIIIMI and NSGAIIIE. The results obtained were compared with existing single-objective approaches, NSGAII, and the strength Pareto evolutionary algorithm, each with both MI and entropy. NSGAIII successfully evolves a set of nondominated solutions and performs better in terms of the number of selected features, classification error rate and computational time on the majority of the datasets.
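The abstract couples NSGAIII with mutual information and entropy as filter-based objectives. The sketch below is an illustrative assumption rather than the authors' implementation: it frames FS as two conflicting filter objectives (fraction of selected features and negated mean MI relevance) and evolves nondominated subsets with NSGA-III via pymoo. The breast-cancer dataset, the pymoo and scikit-learn APIs, and the specific objective pair are stand-ins introduced here for illustration.

import numpy as np
from pymoo.algorithms.moo.nsga3 import NSGA3
from pymoo.core.problem import ElementwiseProblem
from pymoo.operators.crossover.pntx import TwoPointCrossover
from pymoo.operators.mutation.bitflip import BitflipMutation
from pymoo.operators.sampling.rnd import BinaryRandomSampling
from pymoo.optimize import minimize
from pymoo.util.ref_dirs import get_reference_directions
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif

# Illustrative dataset and per-feature relevance scores; the paper's datasets
# and exact NSGAIIIMI / NSGAIIIE formulations are not reproduced here.
X, y = load_breast_cancer(return_X_y=True)
mi_scores = mutual_info_classif(X, y, random_state=0)

class MIFilterFS(ElementwiseProblem):
    # Binary mask over features with two conflicting filter objectives:
    #   f1: fraction of selected features (minimise subset size)
    #   f2: negated mean MI relevance of the subset (maximise relevance)
    def __init__(self, mi):
        super().__init__(n_var=len(mi), n_obj=2, xl=0, xu=1, vtype=bool)
        self.mi = mi

    def _evaluate(self, x, out, *args, **kwargs):
        mask = np.asarray(x, dtype=bool)
        if not mask.any():                 # penalise the empty subset
            out["F"] = [1.0, 0.0]
            return
        out["F"] = [mask.sum() / self.n_var, -self.mi[mask].mean()]

ref_dirs = get_reference_directions("das-dennis", 2, n_partitions=12)
algorithm = NSGA3(ref_dirs=ref_dirs,
                  sampling=BinaryRandomSampling(),
                  crossover=TwoPointCrossover(),
                  mutation=BitflipMutation(),
                  eliminate_duplicates=True)

res = minimize(MIFilterFS(mi_scores), algorithm, ("n_gen", 50), seed=1, verbose=False)
print(res.F)  # nondominated trade-offs: subset size vs. negated MI relevance

The resulting nondominated masks are in res.X; to obtain the classification error rates reported in the paper, a classifier would be trained on each selected subset, a step omitted from this sketch.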