Journal Article

Elastic Deep Sparse Self-Representation Subspace Clustering Network
Document Type
Original Paper
Source
Neural Processing Letters, 56(2)
Subject
Subspace clustering
Representation learning
Deep auto-encoder
Elastic net regularization
Deep sparse feature
Language
English
ISSN
1573-773X
Abstract
Subspace clustering models based on self-representation learning often use the ℓ1, ℓ2, or nuclear norm to constrain the self-representation matrix of the dataset. In theory, the ℓ1 norm can enforce the independence of the subspaces, but it may lead to under-connection because of the sparsity of the self-representation matrix. The ℓ2 and nuclear norm regularizations can improve the connectivity between clusters, but they may lead to over-connection of the self-representation matrix. Because a single regularization term may cause the subspaces to be over- or insufficiently divided, this paper proposes an elastic deep sparse self-representation subspace clustering network (EDS-SC), which imposes sparse constraints on the deep features and introduces elastic net regularization, mixing the ℓ1 and ℓ2 norms, to constrain the self-representation matrix. The network can extract deep sparse features and strikes a balance between subspace independence and connectivity. Experiments on human face, object, and medical imaging datasets demonstrate the effectiveness of the EDS-SC network.
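For reference, a minimal sketch of the elastic-net self-expressive objective the abstract describes, assuming the standard self-representation formulation; the exact EDS-SC loss, its weighting, and the auto-encoder terms are not specified here:

\min_{C} \; \lVert Z - ZC \rVert_F^2 + \lambda_1 \lVert C \rVert_1 + \lambda_2 \lVert C \rVert_F^2 \quad \text{s.t.} \; \operatorname{diag}(C) = 0,

where Z denotes the deep sparse features produced by the encoder and C is the self-representation matrix. The ℓ1 term promotes subspace independence through sparsity, while the ℓ2 (Frobenius) term preserves connectivity within clusters, which is the balance the elastic net regularization is intended to provide.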