Academic Journal

Can Boosted Randomness Mimic Learning Algorithms of Geometric Nature? Example of a Simple Algorithm That Converges in Probability to Hard-Margin SVM.
Document Type
Article
Source
IEEE Transactions on Neural Networks & Learning Systems. Sep2021, Vol. 32 Issue 9, p3798-3818. 21p.
Subject
*MACHINE learning
*ALGORITHMS
*MARKOV processes
*SUPPORT vector machines
*PYTHON programming language
*HYPERPLANES
Language
English
ISSN
2162-237X
Abstract
In the light of the general question posed in the title, we write down a very simple randomized learning algorithm, based on boosting, that can be seen as a nonstationary Markov random process. Surprisingly, the decision hyperplanes resulting from this algorithm converge in probability to the exact hard-margin solutions of support vector machines (SVMs). This fact is curious because the hard-margin hyperplane is not a statistical solution but a purely geometric one, driven by margin maximization and strictly dependent on the particular locations of certain data points lying in the contact region between the two classes, namely the support vectors. The proposed algorithm detects support vectors probabilistically, without being aware of their geometric definition. We give proofs of the main convergence theorem and several auxiliary lemmas. The analysis sheds new light on the relation between boosting and SVMs, and also on the nature of SVM solutions, since these can now be regarded equivalently as limits of certain random trajectories. In the experimental part, the correctness of the proposed algorithm is verified against known SVM solvers (libsvm, liblinear) and against general-purpose optimization packages (cvxopt for Python and Wolfram Mathematica). [ABSTRACT FROM AUTHOR]
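The abstract only sketches the idea; the paper's actual update rule, step sizes, and boosting scheme are not reproduced here. As a rough, hypothetical illustration of the flavor of such a procedure, the following Python sketch (all constants and update rules below are assumptions, not taken from the paper) runs a boosting-style random sampling loop: points violating the current margin are sampled, used for a perceptron-like update, and have their sampling weight multiplicatively boosted, so the sampling distribution tends to concentrate on points near the class boundary. The result is compared against libsvm via scikit-learn's SVC with a large C as a hard-margin surrogate.

```python
# Hypothetical sketch only: the paper's actual algorithm is NOT given in the
# abstract; everything below is an assumption meant to convey the flavor of
# "boosted randomness" drifting toward a hard-margin separator.
import numpy as np
from sklearn.svm import SVC  # SVC wraps libsvm; large C approximates hard margin

rng = np.random.default_rng(0)

# Linearly separable 2-D toy data: two well-separated Gaussian blobs.
n = 100
X = np.vstack([rng.normal([2.5, 2.5], 0.5, (n, 2)),
               rng.normal([-2.5, -2.5], 0.5, (n, 2))])
y = np.hstack([np.ones(n), -np.ones(n)])

w, b = np.zeros(2), 0.0
p = np.full(2 * n, 1.0 / (2 * n))  # boosting-style sampling distribution

for t in range(1, 20001):
    i = rng.choice(2 * n, p=p)          # draw a training point at random
    if y[i] * (X[i] @ w + b) <= 1.0:    # functional-margin violation
        eta = 1.0 / np.sqrt(t)          # decaying step size (assumed)
        w += eta * y[i] * X[i]          # perceptron-like update
        b += eta * y[i]
        p[i] *= 1.05                    # boost the weight of the hard point
        p /= p.sum()                    # keep p a probability distribution

# Reference hard-margin solution via libsvm (large C on separable data).
svc = SVC(kernel="linear", C=1e6).fit(X, y)
w_svm = svc.coef_[0]

# Compare unit normals of the two hyperplanes; they should roughly agree.
print("sketch normal:", w / np.linalg.norm(w))
print("libsvm normal:", w_svm / np.linalg.norm(w_svm))
```

On this toy data the two unit normals typically agree to within a few percent, which is only a sanity check of the general mechanism; the paper's contribution, by contrast, is a proof of convergence in probability to the exact hard-margin solution, which this sketch does not claim to reproduce.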