Academic Paper

Learning Hierarchies from ICA Mixtures
Document Type
Conference
Source
2007 International Joint Conference on Neural Networks (IJCNN 2007), pp. 2271-2276, Aug. 2007
Subject
Computing and Processing
Components, Circuits, Devices and Systems
Signal Processing and Analysis
Independent component analysis
Vectors
Signal processing algorithms
Clustering algorithms
Probability density function
Parameter estimation
Automatic testing
Nondestructive testing
Neural networks
Merging
Language
English
ISSN
2161-4393
2161-4407
Abstract
This paper presents a novel procedure for classifying data from mixtures of independent component analyzers. The procedure comprises two stages: learning the parameters of the mixtures (basis vectors and bias terms), and clustering the ICA mixtures following a bottom-up agglomerative scheme to construct a hierarchy for classification. The source probability density functions are estimated non-parametrically, and the minimum Kullback-Leibler distance is used as the criterion for merging clusters at each level of the hierarchy. The proposed method is validated on several simulations, including ICA mixtures with uniform and Laplacian source distributions, and on real data from impact-echo testing experiments.
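As a rough illustration of the merging criterion described in the abstract, the sketch below implements a generic bottom-up agglomeration driven by a symmetric Kullback-Leibler distance between clusters' source samples. The histogram-based density estimate, the symmetrisation of the KL distance, and the helper names kl_between_clusters and agglomerate are assumptions made for this illustration; the paper's own parameter-learning stage and non-parametric estimator are not reproduced here.

import numpy as np
from scipy.stats import entropy


def kl_between_clusters(sources_a, sources_b, bins=50):
    # Symmetric KL distance between two clusters' source distributions,
    # estimated non-parametrically with per-dimension histograms on a shared range.
    total = 0.0
    for dim in range(sources_a.shape[1]):
        lo = min(sources_a[:, dim].min(), sources_b[:, dim].min())
        hi = max(sources_a[:, dim].max(), sources_b[:, dim].max())
        edges = np.linspace(lo, hi, bins + 1)
        p, _ = np.histogram(sources_a[:, dim], bins=edges, density=True)
        q, _ = np.histogram(sources_b[:, dim], bins=edges, density=True)
        p, q = p + 1e-12, q + 1e-12             # keep the KL terms finite
        total += entropy(p, q) + entropy(q, p)  # symmetrised KL divergence
    return total


def agglomerate(cluster_sources):
    # Bottom-up agglomeration: repeatedly merge the pair of clusters with the
    # minimum symmetric KL distance, recording the merge order as a hierarchy.
    clusters = {k: s for k, s in enumerate(cluster_sources)}
    hierarchy, next_id = [], len(clusters)
    while len(clusters) > 1:
        keys = list(clusters)
        best, pair = np.inf, None
        for a in range(len(keys)):
            for b in range(a + 1, len(keys)):
                d = kl_between_clusters(clusters[keys[a]], clusters[keys[b]])
                if d < best:
                    best, pair = d, (keys[a], keys[b])
        i, j = pair
        hierarchy.append((i, j, best))
        clusters[next_id] = np.vstack([clusters.pop(i), clusters.pop(j)])
        next_id += 1
    return hierarchy


# Toy usage: two Laplacian clusters and one uniform cluster of 1-D sources;
# the two Laplacian clusters should be the first pair to merge.
rng = np.random.default_rng(0)
toy = [rng.laplace(size=(500, 1)),
       1.1 * rng.laplace(size=(500, 1)),
       rng.uniform(-1.0, 1.0, size=(500, 1))]
print(agglomerate(toy))

In this sketch the per-cluster samples are assumed to be already transformed to source space by the learned ICA parameters; the nested loop recomputes all pairwise distances at each level, which is adequate for a small number of clusters but not optimised.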