Academic Journal Article

Joint Sparse Locality Preserving Regression for Discriminative Learning
Document Type
Periodical
Source
IEEE Transactions on Emerging Topics in Computational Intelligence, 8(1):790-801, Feb. 2024
Subject
Computing and Processing
Feature extraction
Face recognition
Sparse matrices
Linear programming
Robustness
Principal component analysis
Optimization
Regression
jointly sparse
discriminative learning
feature selection
locality preserving
Language
English
ISSN
2471-285X
Abstract
Ridge Regression (RR) is a classical method that is widely used in multiple regression analysis. However, traditional RR does not take the local geometric structure of the data into consideration for discriminative learning, and it is sensitive to outliers because it is based on the $L_{2}$-norm. To address these problems, this article proposes a novel method called Joint Sparse Locality Preserving Regression (JSLPR) for discriminative learning. JSLPR not only applies the $L_{2,1}$-norm to both the loss function and the regularization term but also takes the local geometric structure of the data into consideration. The $L_{2,1}$-norm guarantees robustness to outliers and noise as well as joint sparsity for effective feature selection, while accounting for the local geometric structure improves feature extraction and selection when the data lie on a manifold. To solve the optimization problem of JSLPR, an iterative algorithm is proposed and its convergence is proven. Experiments on four well-known face databases demonstrate the merits of the proposed JSLPR for feature extraction and selection.
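As a reading aid, a plausible form of the objective described in the abstract (reconstructed from its description, not quoted from the paper; the exact formulation, including any bias term and weighting, may differ) is

$\min_{W}\; \|XW - Y\|_{2,1} \;+\; \alpha\,\mathrm{tr}\!\left(W^{\top}X^{\top}LXW\right) \;+\; \beta\,\|W\|_{2,1}$,

where $X \in \mathbb{R}^{n \times d}$ holds the training samples, $Y \in \mathbb{R}^{n \times c}$ the regression targets, $L$ is a graph Laplacian encoding local neighborhoods, and $\|M\|_{2,1} = \sum_{i} \|m^{i}\|_{2}$ sums the $L_{2}$-norms of the rows. Objectives of this shape are commonly minimized by iterative reweighting: each $L_{2,1}$ term is replaced by a weighted $L_{2}$ term whose row weights are refreshed from the previous iterate, giving a closed-form linear solve per iteration. The Python sketch below illustrates that generic scheme under the assumed objective above; it is not the authors' published algorithm, and all parameter names are illustrative.

import numpy as np

def jslpr_sketch(X, Y, L, alpha=1.0, beta=1.0, n_iter=50, eps=1e-8):
    """Iteratively reweighted solver sketch for an assumed JSLPR-style objective:
        min_W ||X W - Y||_{2,1} + alpha * tr(W^T X^T L X W) + beta * ||W||_{2,1}
    X: (n, d) samples, Y: (n, c) targets, L: (n, n) graph Laplacian.
    """
    n, d = X.shape
    # Ridge warm start so the initial row norms of W are nonzero.
    W = np.linalg.solve(X.T @ X + beta * np.eye(d), X.T @ Y)
    XtLX = X.T @ L @ X
    for _ in range(n_iter):
        E = X @ W - Y
        # Row weights 1 / (2 ||row||_2) for each L_{2,1} term; eps guards
        # against division by zero for vanishing rows.
        d1 = 1.0 / (2.0 * np.maximum(np.linalg.norm(E, axis=1), eps))
        d2 = 1.0 / (2.0 * np.maximum(np.linalg.norm(W, axis=1), eps))
        # Closed-form solve of the reweighted least-squares subproblem:
        # (X^T D1 X + alpha X^T L X + beta D2) W = X^T D1 Y
        A = X.T @ (d1[:, None] * X) + alpha * XtLX + beta * np.diag(d2)
        W = np.linalg.solve(A, X.T @ (d1[:, None] * Y))
    return W

Because the $L_{2,1}$ regularizer drives entire rows of $W$ toward zero, features can afterwards be ranked by the row norms of $W$, which is what makes the projection jointly sparse and usable for feature selection.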