
A UNIFICATION OF COMPONENT ANALYSIS METHODS

https://doi.org/10.1142/9789814273398_0001 · Cited by: 3 (Source: Crossref)

Abstract:

Over the last century, Component Analysis (CA) methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Canonical Correlation Analysis (CCA), k-means and Spectral Clustering (SC) have been extensively used as a feature extraction step for modeling, classification, visualization and clustering. CA techniques are appealing because many can be formulated as eigen-problems, offering great potential for learning linear and non-linear representations of data without local minima. However, the eigen-formulation often conceals important analytic and computational drawbacks of CA techniques, such as the need to solve generalized eigen-problems with rank-deficient matrices, the lack of an intuitive interpretation of normalization factors, and the difficulty of understanding relationships between different CA methods.
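As a minimal illustration of the eigen-formulation mentioned above (a NumPy sketch, not code from the chapter; the data and dimensions are invented for the example): PCA reduces to an eigen-problem on the sample covariance matrix, whose leading eigenvectors give the principal directions.

```python
import numpy as np

# Sketch: PCA via its classical eigen-formulation (illustrative data).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))    # 100 samples, 5 features (assumed)
Xc = X - X.mean(axis=0)              # center the data
C = Xc.T @ Xc / (len(Xc) - 1)        # sample covariance, 5 x 5
evals, evecs = np.linalg.eigh(C)     # eigenvalues in ascending order
W = evecs[:, ::-1][:, :2]            # top-2 principal directions
Z = Xc @ W                           # 2-D projection of the data
print(Z.shape)                       # (100, 2)
```

The eigenvectors returned by `eigh` are orthonormal, so the projection is a rotation onto the leading directions of variance; the drawbacks the chapter lists (rank deficiency, normalization) arise in the generalized eigen-problems of LDA and CCA, not in this plain case.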

This chapter proposes a unified framework that formulates many CA methods as a least-squares estimation problem. We show how PCA, LDA, CCA, k-means, spectral graph methods and kernel extensions each correspond to a particular instance of a least-squares weighted kernel reduced rank regression (LS-WKRRR). The least-squares formulation allows a better understanding of normalization factors, provides a clean framework for understanding the commonalities and differences between many CA methods, yields efficient optimization algorithms for many CA methods, suggests easy derivations of online learning methods, and provides an easier generalization of CA techniques. In addition, we derive weighted generalizations of PCA, LDA, SC and CCA (including kernel extensions).
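To make the least-squares view concrete for the simplest case (a sketch under assumptions, not the chapter's LS-WKRRR derivation): unweighted, linear PCA can be posed as the reduced-rank least-squares problem min over Z, W of ||Xc − Z Wᵀ||²_F, whose closed-form solution is a truncated SVD. The recovered subspace coincides with the one found by the eigen-formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 6))    # illustrative data, 200 x 6
Xc = X - X.mean(axis=0)
k = 2                                # target rank (assumed)

# Eigen-formulation: top-k eigenvectors of X^T X.
evals, evecs = np.linalg.eigh(Xc.T @ Xc)
W_eig = evecs[:, ::-1][:, :k]

# Least-squares formulation: min_{Z,W} ||Xc - Z W^T||_F^2,
# solved in closed form by a rank-k truncated SVD of Xc.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
W_ls = Vt[:k].T

# Both recover the same k-dimensional subspace:
# the orthogonal projectors onto span(W) agree (up to sign/rotation).
P_eig = W_eig @ W_eig.T
P_ls = W_ls @ W_ls.T
print(np.allclose(P_eig, P_ls))      # True
```

Comparing projectors rather than the factors themselves sidesteps the sign and rotation ambiguity of the least-squares factorization; the weighted and kernelized instances unified in the chapter replace this plain SVD with weighted or kernel-regression analogues.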