Relaxed least square regression with ℓ2,1-norm for pattern classification

https://doi.org/10.1142/S021969132350025X
Cited by: 7 (Source: Crossref)

This work aims to address two issues that often arise in least square regression (LSR) models for classification tasks: (1) learning a compact projection matrix for feature selection and (2) adopting relaxed regression targets. To this end, we first propose a sparse regularized LSR framework for feature selection by introducing the ℓ2,1 regularizer. Second, we employ two different strategies to relax the strict regression targets within the sparse framework. One strategy exploits the ε-dragging technique. The other directly learns the labels from the inputs while simultaneously constraining the distance between the true and false classes. Hence, more feasible regression schemes are constructed, and the models become more flexible. Further, efficient iterative methods are derived to optimize the proposed models. Extensive experiments on image databases demonstrate that the proposed models achieve outstanding recognition performance compared with many state-of-the-art classifiers.
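
The sketch below illustrates the general flavor of such a model, assuming the common objective min_W ||XW − T||_F^2 + λ||W||_{2,1} with ε-dragged targets T = Y + B ⊙ M and an alternating iteratively reweighted update; the function names, the parameter λ, and the update schedule are illustrative assumptions, not the paper's exact algorithm.

# A minimal sketch (not the paper's exact method): l2,1-regularized least
# square regression with epsilon-dragged targets, solved by alternating
# updates. The regularizer weight `lam` and iteration count are illustrative.
import numpy as np

def l21_lsr_fit(X, y, lam=0.1, n_iter=50, eps=1e-8):
    """Fit W for min_W ||X W - T||_F^2 + lam * ||W||_{2,1},
    where T = Y + B * M is an epsilon-dragged target matrix."""
    n, d = X.shape
    classes = np.unique(y)
    c = len(classes)
    # One-hot label matrix Y and dragging-direction matrix B
    # (+1 for the true class, -1 otherwise).
    Y = np.zeros((n, c))
    Y[np.arange(n), np.searchsorted(classes, y)] = 1.0
    B = np.where(Y > 0, 1.0, -1.0)
    M = np.zeros_like(Y)          # non-negative dragging offsets
    W = np.zeros((d, c))
    XtX = X.T @ X
    for _ in range(n_iter):
        # Reweighting matrix for the l2,1 norm: D_ii = 1 / (2 ||w_i||_2).
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
        T = Y + B * M
        # Closed-form W update (iteratively reweighted least squares step).
        W = np.linalg.solve(XtX + lam * D, X.T @ T)
        # M update: keep only residuals that drag targets in the allowed direction.
        M = np.maximum(B * (X @ W - Y), 0.0)
    return W, classes

def l21_lsr_predict(X, W, classes):
    return classes[np.argmax(X @ W, axis=1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 20))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    W, classes = l21_lsr_fit(X, y, lam=0.5)
    acc = (l21_lsr_predict(X, W, classes) == y).mean()
    print(f"training accuracy: {acc:.2f}")
    # Row norms of W indicate feature relevance; small rows are driven toward zero.
    print("first 5 row norms:", np.round(np.sqrt((W ** 2).sum(axis=1)), 3)[:5])

In this sketch the row-sparsity of W induced by the ℓ2,1 term serves as the feature-selection mechanism, while the non-negative offsets M enlarge the margin between true and false class targets.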

    AMSC: 22E46, 53C35, 57S20