Journal Article

Margin-Aware Adaptive-Weighted-Loss for Deep Learning Based Imbalanced Data Classification
Document Type
Periodical
Source
IEEE Transactions on Artificial Intelligence, 5(2):776-785, Feb. 2024
Subject
Computing and Processing
Training
Costs
Data models
Adaptation models
Tail
Robustness
Perturbation methods
CIFAR-10
class imbalance
deep learning (DL)
FMNIST
large margin softmax
loss function
Language
English
ISSN
2691-4581
Abstract
In supervised learning, the class imbalance problem often biases results toward the majority classes. Existing methods for handling class imbalance overlook a principal aspect, namely separating the overlapping classes, which is why most of them are prone to overfitting on the training data. To this end, we propose a novel loss function, the margin-aware adaptive-weighted loss. We first use the large margin softmax to encourage intraclass compactness and interclass separability. Then, to learn an unbiased representation of the classes, we put forward a dynamically weighted loss for imbalanced data classification. This weight adapts on every minibatch based on the inverse class frequencies. In addition, the loss handles hard-to-train samples by using confidence scores to learn discriminative hidden representations of the data. The overall framework proves effective on two widely used datasets: 1) Canadian Institute for Advanced Research (CIFAR)-10 and 2) Fashion-MNIST. Additional experiments on the human against machine and Asia Pacific tele-ophthalmology society 2019 blindness detection datasets demonstrate the robustness of our methodology.
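A minimal sketch of the dynamic reweighting idea described above, assuming weights proportional to the inverse class frequencies recomputed on each minibatch and a focal-style confidence factor that emphasizes hard-to-train samples. The function names, the `gamma` parameter, and the normalization choice are illustrative assumptions, not the authors' exact formulation (in particular, the large-margin softmax component is not reproduced here):

```python
import numpy as np

def minibatch_class_weights(labels, num_classes):
    """Weights proportional to inverse class frequency in this minibatch.

    Hypothetical helper: recomputed per batch so the weighting adapts
    as the class mix of each minibatch changes.
    """
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    inv = np.where(counts > 0, 1.0 / counts, 0.0)
    return inv / inv.sum()  # normalize weights of present classes to sum to 1

def weighted_cross_entropy(probs, labels, num_classes, gamma=1.0):
    """Class-weighted cross-entropy with a (1 - p)^gamma confidence factor.

    Low-confidence (hard) samples receive a larger multiplier, while the
    per-batch class weights counteract the majority-class bias.
    """
    w = minibatch_class_weights(labels, num_classes)
    p = probs[np.arange(len(labels)), labels]  # confidence of the true class
    losses = -w[labels] * (1.0 - p) ** gamma * np.log(p + 1e-12)
    return losses.mean()

# Example: an imbalanced batch of 3 majority-class and 1 minority-class samples.
labels = np.array([0, 0, 0, 1])
probs = np.array([[0.9, 0.1], [0.8, 0.2], [0.7, 0.3], [0.4, 0.6]])
print(minibatch_class_weights(labels, 2))  # minority class gets the larger weight
print(weighted_cross_entropy(probs, labels, 2))
```

In this sketch the minority class (frequency 1/4) receives three times the weight of the majority class (frequency 3/4), so a misclassified minority sample contributes more to the batch loss.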