World Scientific

Error analysis of the kernel regularized regression based on refined convex losses and RKBSs

    https://doi.org/10.1142/S0219691321500120 · Cited by: 6 (Source: Crossref)

    In this paper, we bound the errors of kernel regularized regressions associated with 2-uniformly convex two-sided RKBSs and differentiable uniformly smooth losses σ(t) = |t|^p / p (1 < p ≤ 2). In particular, we give learning rates for the learning algorithm with loss V_p(t) = |t|^p / p (1 < p ≤ 2). We also establish a probability inequality, with which we provide error bounds for kernel regularized regression with loss V_q(t) = |t|^q / q (q > 2). The discussion is a comprehensive application of the theories of uniformly smooth functions, uniformly convex functions, and uniformly convex Banach spaces.
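    The loss family studied above can be illustrated numerically. A minimal sketch, assuming NumPy; the function name `power_loss` is ours and the paper itself works with these losses only analytically:

    ```python
    import numpy as np

    def power_loss(t, p):
        """Loss |t|^p / p on the residual t = y - f(x).

        Per the abstract's two regimes:
        - 1 < p <= 2: the loss is uniformly smooth (the V_p case);
        - p > 2: the loss is uniformly convex (the V_q case).
        """
        return np.abs(t) ** p / p

    # The losses interpolate between slowly growing penalties (p near 1)
    # and rapidly growing ones (q > 2) on the regression residual.
    residuals = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    smooth_vals = power_loss(residuals, 1.5)  # smooth regime, 1 < p <= 2
    convex_vals = power_loss(residuals, 3.0)  # convex regime, q > 2
    print(smooth_vals)
    print(convex_vals)
    ```

    For p = 2 this reduces to the familiar least-squares loss t²/2, which is the boundary case shared by both regimes.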

    AMSC: 90C25, 68Q32, 68T40, 41A25