Academic paper

Nonnegative sparse coding for discriminative semi-supervised learning
Document Type
Conference
Source
2011 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 2849-2856, Jun. 2011
Subject
Computing and Processing
Components, Circuits, Devices and Systems
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Keywords
Encoding
Sparse matrices
Clustering algorithms
Prediction algorithms
Databases
Training
Machine learning
Language
English
ISSN
1063-6919
Abstract
An informative and discriminative graph plays an important role in graph-based semi-supervised learning methods. This paper introduces a nonnegative sparse algorithm, and an approximation of it based on the l0-l1 equivalence theory, to compute the nonnegative sparse weights of a graph; the proposed method is therefore termed the sparse probability graph (SPG). The nonnegative sparse weights in the graph naturally serve as clustering indicators, which benefits semi-supervised learning. More importantly, our approximation algorithm speeds up the computation of the nonnegative sparse coding, which has been a bottleneck in previous attempts at sparse nonnegative graph learning, and it is much more efficient than l1-norm sparsity techniques for learning large-scale sparse graphs. Finally, for discriminative semi-supervised learning, an adaptive label propagation algorithm is also proposed to iteratively predict the labels of data on the SPG. Promising experimental results show that nonnegative sparse coding is efficient and effective for discriminative semi-supervised learning.
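To make the abstract's pipeline concrete, the sketch below illustrates the general idea of building a graph from nonnegative sparse reconstruction weights and then propagating labels over it. It is a minimal, hedged approximation: it uses nonnegative least squares (scipy.optimize.nnls) as a stand-in for the paper's l0-based nonnegative sparse coding and its fast approximation, and standard iterative label propagation rather than the paper's adaptive variant; all function names and parameters here are illustrative, not the authors' implementation.

# Illustrative sketch only: a generic nonnegative-sparse-graph + label
# propagation pipeline in the spirit of the abstract. NNLS and the standard
# propagation rule are stand-ins, not the paper's algorithms.
import numpy as np
from scipy.optimize import nnls

def build_nonnegative_sparse_graph(X):
    """Reconstruct each sample from the others with nonnegative weights.

    X: (n_samples, n_features). Returns an (n, n) weight matrix whose rows
    are nonnegative and tend to be sparse (NNLS zeroes out many entries).
    """
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        # min_w ||X[others].T @ w - X[i]||_2  subject to  w >= 0
        w, _ = nnls(X[others].T, X[i])
        W[i, others] = w
    return (W + W.T) / 2  # symmetrize so the graph is undirected

def propagate_labels(W, y, n_classes, alpha=0.99, n_iter=100):
    """Standard iterative label propagation on the graph W.

    y: (n,) integer labels, with -1 marking unlabeled samples.
    """
    n = len(y)
    Y = np.zeros((n, n_classes))
    Y[y >= 0, y[y >= 0]] = 1.0            # one-hot encode the labeled points
    d = W.sum(axis=1) + 1e-12
    S = W / np.sqrt(np.outer(d, d))       # symmetric normalization D^-1/2 W D^-1/2
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * S @ F + (1 - alpha) * Y
    return F.argmax(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two Gaussian clusters with one labeled point each.
    X = np.vstack([rng.normal(0, 0.3, (20, 5)), rng.normal(3, 0.3, (20, 5))])
    y = -np.ones(40, dtype=int)
    y[0], y[20] = 0, 1
    W = build_nonnegative_sparse_graph(X)
    print(propagate_labels(W, y, n_classes=2))

In this toy run, the nonnegative reconstruction weights connect each point mainly to points in its own cluster, so propagating the two seed labels recovers the cluster membership; the paper's contribution lies in computing such weights sparsely and efficiently at scale and in adapting the propagation step.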