Academic Article

Generalizing Correspondence Analysis for Applications in Machine Learning
Document Type
Periodical
Source
IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(12):9347-9362, Dec. 2022
Subject
Computing and Processing
Bioengineering
Correlation
Random variables
Data visualization
Principal component analysis
Kernel
Optimization
Hilbert space
Correspondence analysis
principal inertia components
principal functions
maximal correlation
canonical correlation analysis
interpretability
visualization
multi-view learning
multi-modal learning
Language
English
ISSN
0162-8828
2160-9292
1939-3539
Abstract
Correspondence analysis (CA) is a multivariate statistical tool used to visualize and interpret data dependencies by finding maximally correlated embeddings of pairs of random variables. CA has found applications in fields ranging from epidemiology to the social sciences. However, current methods for CA do not scale to large, high-dimensional datasets. In this paper, we provide a novel interpretation of CA in terms of an information-theoretic quantity called the principal inertia components. We show that estimating the principal inertia components, which amounts to solving a functional optimization problem over the space of finite-variance functions of two random variables, is equivalent to performing CA. We then leverage this insight to design algorithms that perform CA at scale. Specifically, we demonstrate how the principal inertia components can be reliably approximated from data using deep neural networks. Finally, we show how the maximally correlated embeddings of pairs of random variables in CA further play a central role in several learning problems, including multi-view and multi-modal learning methods and visualization of classification boundaries.
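For context on the technique the abstract describes, classical CA on a contingency table can be computed via the SVD of the table's standardized residuals; the squared singular values are the principal inertia components, and the principal coordinates are the maximally correlated embeddings. The sketch below is a minimal NumPy illustration of that textbook formulation, not the scalable neural estimator proposed in the paper; all names and the toy table are illustrative.

```python
import numpy as np

def correspondence_analysis(table, n_components=2):
    """Classical CA of a two-way contingency table (illustrative sketch)."""
    P = table / table.sum()                  # correspondence matrix
    r = P.sum(axis=1)                        # row masses
    c = P.sum(axis=0)                        # column masses
    # Standardized residuals: D_r^{-1/2} (P - r c^T) D_c^{-1/2}
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sing, Vt = np.linalg.svd(S, full_matrices=False)
    # Squared singular values = principal inertia components;
    # principal coordinates give the maximally correlated embeddings.
    row_coords = (U[:, :n_components] * sing[:n_components]) / np.sqrt(r)[:, None]
    col_coords = (Vt[:n_components].T * sing[:n_components]) / np.sqrt(c)[:, None]
    return row_coords, col_coords, sing[:n_components] ** 2

# Toy synthetic contingency table (illustrative data only)
counts = np.array([[30.0, 10.0,  5.0],
                   [10.0, 40.0, 10.0],
                   [ 5.0, 10.0, 30.0]])
rows, cols, inertias = correspondence_analysis(counts)
```

The two-dimensional `rows` and `cols` coordinates are what a CA biplot would display; the paper's contribution is estimating the same quantities when the variables are high-dimensional and a full contingency table is infeasible.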