Academic Article

Fast ramp fraction loss SVM classifier with low computational complexity for pattern classification
Document Type
Article
Source
Neural Networks, Vol. 184, April 2025
Language
English
ISSN
0893-6080
Abstract
The support vector machine (SVM) is a powerful tool for pattern classification thanks to its outstanding efficiency. However, on large-scale classification tasks its considerable computational complexity can become a substantial barrier. To reduce this complexity, a novel ramp fraction loss SVM model, called Lrf-SVM, is introduced, with the aim of achieving sparsity and robustness simultaneously. Building on a proposed notion of proximal stationary point, a new optimality theory is developed for the nonsmooth and nonconvex Lrf-SVM. Drawing on this theory, an efficient alternating direction method of multipliers (ADMM) that incorporates a working set and has low computational complexity is introduced for solving Lrf-SVM, and the algorithm is shown to achieve global convergence. Numerical experiments demonstrate that the algorithm is highly efficient, surpassing nine other state-of-the-art solvers in terms of the number of support vectors, computation speed, classification accuracy, and robustness to outliers. For example, on a real dataset with over 10^7 samples, the algorithm finishes the classification in just 18.67 s, whereas the other algorithms need at least 605.3 s.
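
Note (not part of the record): the abstract does not state the model's formulation, but loss-based SVMs of this kind are commonly written as the regularized empirical-risk problem sketched below; here \ell_{rf} is only a placeholder for the ramp fraction loss, whose exact definition appears in the paper itself, not in this abstract.

    \min_{w \in \mathbb{R}^d,\; b \in \mathbb{R}} \;\; \frac{1}{2}\|w\|_2^2 \;+\; C \sum_{i=1}^{n} \ell_{rf}\!\bigl(1 - y_i (w^\top x_i + b)\bigr)

where (x_i, y_i) with y_i \in \{-1, +1\} are the n training samples and C > 0 balances the margin term against the data-fit term. Ramp-type losses are bounded, which is the standard route to robustness against outliers, and they vanish on well-classified points, which tends to keep the number of support vectors small. The ADMM solver mentioned in the abstract would, in generic form (again a sketch, not the paper's specific working-set variant), introduce an auxiliary variable z, form the augmented Lagrangian L_\rho, and alternate

    w^{k+1} = \arg\min_{w} L_\rho(w, z^{k}, \lambda^{k}), \qquad
    z^{k+1} = \arg\min_{z} L_\rho(w^{k+1}, z, \lambda^{k}), \qquad
    \lambda^{k+1} = \lambda^{k} + \rho\,(A w^{k+1} + B z^{k+1} - c)

until a stopping criterion, such as the proximal stationarity condition the abstract refers to, is satisfied.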