Classification with Gaussians and convex loss II: improving error bounds by noise conditions (cited 3 times)

Author: XIANG DaoHong. Science China Mathematics (SCIE), 2011, Issue 1, pp. 165–171 (7 pages).
Abstract: We continue our study of classification learning algorithms generated by Tikhonov regularization schemes associated with Gaussian kernels and general convex loss functions. The main purpose of this paper is to improve error bounds by presenting a new comparison theorem associated with general convex loss functions and Tsybakov noise conditions. Concrete examples are provided to illustrate the improved learning rates, which demonstrate the effect of various loss functions on learning algorithms. In our analysis, the convexity of the loss functions plays a central role.
Keywords: reproducing kernel Hilbert space, binary classification, general convex loss, Tsybakov noise condition, Sobolev space
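For reference, the two central objects named in the abstract have standard formulations in the learning-theory literature; the sketch below uses those standard forms, and the paper's exact notation and exponent conventions may differ.

```latex
% Tikhonov regularization scheme over a Gaussian RKHS H_\sigma with a
% convex loss \phi (e.g. hinge or logistic), sample z = \{(x_i, y_i)\}_{i=1}^m:
f_z = \arg\min_{f \in \mathcal{H}_\sigma}
      \left\{ \frac{1}{m} \sum_{i=1}^{m} \phi\bigl(y_i f(x_i)\bigr)
              + \lambda \|f\|_{\mathcal{H}_\sigma}^2 \right\}.

% Tsybakov noise condition: for some exponent q \in [0, \infty) and
% constant c_q > 0, the marginal distribution \rho_X satisfies
\rho_X\bigl(\{x \in X : |f_\rho(x)| \le t\}\bigr) \le c_q\, t^{q},
\qquad \forall\, t > 0,
% where f_\rho(x) = \mathbb{E}(y \mid x) is the regression function of the
% binary labels y \in \{-1, 1\}. Larger q puts less probability mass near
% the decision boundary, which is what allows faster learning rates.
```

A comparison theorem of the kind the abstract describes bounds the excess misclassification error of sign(f) in terms of the excess convex risk of f; the noise exponent q sharpens that bound, improving the resulting learning rates.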