DOI: https://doi.org/10.1142/S0218488525500096

Distance metric learning and nonlinear dimensionality reduction are intrinsically related, since they are two perspectives on the same fundamental problem: learning compact and meaningful data representations for classification and visualization. In this paper, we propose a graph-based generalization of the Semi-Supervised Dimensionality Reduction (SSDR) algorithm that uses stochastic distances (Kullback-Leibler, Bhattacharyya and Cauchy-Schwarz divergences) to compute the similarity between local multivariate Gaussian distributions along the K Nearest Neighbors (KNN) graph built from the samples in the input high-dimensional space. In summary, there are two variants of the proposed method: one that uses only a fraction of the labeled samples (10%), and another that additionally uses a clustering method (Gaussian Mixture Models) to estimate the labels along the minimum spanning tree of the KNN graph, incorporating more information into the process. Experimental results with several real datasets show that the proposed method improves the classification accuracy of several supervised classifiers as well as the quality of the obtained clusters (Silhouette Coefficients) in comparison to the regular SSDR algorithm, making it a viable alternative for pattern classification problems.
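To make the core building block of the approach concrete, the sketch below shows one plausible way to fit a local multivariate Gaussian to each sample's KNN neighborhood and weight the edges of the KNN graph with the closed-form Kullback-Leibler divergence between the endpoint Gaussians. This is not the authors' implementation; function names, the regularization term, and the neighborhood size are illustrative assumptions.

```python
# Minimal sketch: local Gaussians over KNN neighborhoods and KL-divergence
# edge weights. Names (kl_gaussians, knn_edge_divergences, reg) are hypothetical.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def kl_gaussians(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence between two multivariate Gaussians N0 and N1."""
    d = mu0.shape[0]
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0)
                  + diff @ inv1 @ diff
                  - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def knn_edge_divergences(X, k=10, reg=1e-3):
    """Build a KNN graph and weight each directed edge (i -> j) by the KL
    divergence between the local Gaussians fitted to the two neighborhoods."""
    n, d = X.shape
    nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nbrs.kneighbors(X)          # idx[i, 0] is the point i itself
    # Local Gaussian parameters per sample (mean and regularized covariance)
    mus = np.empty((n, d))
    covs = np.empty((n, d, d))
    for i in range(n):
        patch = X[idx[i]]
        mus[i] = patch.mean(axis=0)
        covs[i] = np.cov(patch, rowvar=False) + reg * np.eye(d)
    # Stochastic distance along each KNN edge
    weights = {}
    for i in range(n):
        for j in idx[i, 1:]:
            weights[(i, j)] = kl_gaussians(mus[i], covs[i], mus[j], covs[j])
    return weights
```

Under the assumptions above, the Bhattacharyya or Cauchy-Schwarz divergence mentioned in the abstract could be substituted for `kl_gaussians` without changing the surrounding graph construction.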