Error analysis of the kernel regularized regression based on refined convex losses and RKBSs
Abstract
In this paper, we bound the errors of kernel regularized regression associated with 2-uniformly convex two-sided RKBSs and differentiable losses that are $\sigma(t)=\frac{|t|^p}{p}\ (1<p\le 2)$-uniformly smooth. In particular, we give learning rates for the learning algorithm with the loss $V_p(t)=\frac{|t|^p}{p}\ (1<p\le 2)$. We also establish a probability inequality, with which we provide error bounds for kernel regularized regression with the loss $V_q(t)=\frac{|t|^q}{q}\ (q>2)$. The discussion is a comprehensive application of the theory of uniformly smooth functions, the theory of uniformly convex functions, and the theory of uniformly convex spaces.