Abstract
The maximum correntropy criterion induced regression (MCCR) is widely used in signal processing, and its consistency analysis has become an active research topic in learning theory. We provide a new framework for analyzing the learning error: the non-convex kernel regularized problem is transformed into a local convex optimization, and convex analysis is then applied to give a theoretical convergence analysis of the kernel regularized MCCR. We express the optimal regression function as the solution of an integral equation, bound the generalization error of the kernel regularized MCCR with a K-functional and the best reproducing kernel Hilbert space approximation, and derive an upper bound on the learning rate.
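For orientation, the following is a minimal LaTeX sketch of the standard correntropy-induced loss, the kernel regularized MCCR estimator, and the K-functional typically used in such error bounds. The symbols and normalizations below follow common learning-theory conventions and are assumptions for illustration, not quotations from the paper.

% A sketch of the standard kernel-regularized MCCR setting; the paper's
% exact normalization of the loss and regularizer may differ.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Given samples $\{(x_i,y_i)\}_{i=1}^{m}$, a scale parameter $\sigma>0$, and a
reproducing kernel Hilbert space $\mathcal{H}_K$ with norm $\|\cdot\|_K$, the
correntropy-induced loss and the regularized estimator are
\begin{align}
  \ell_\sigma(t) &= \sigma^2\bigl(1-e^{-t^2/\sigma^2}\bigr),\\
  f_{\mathbf{z},\lambda} &= \operatorname*{arg\,min}_{f\in\mathcal{H}_K}
     \frac{1}{m}\sum_{i=1}^{m}\ell_\sigma\bigl(y_i-f(x_i)\bigr)
     + \lambda\|f\|_K^2.
\end{align}
The loss $\ell_\sigma$ is convex only near the origin, so the optimization
problem is non-convex globally; this is what motivates localizing the analysis
to a convex region. The approximation side of the error is measured by a
$K$-functional,
\begin{equation}
  \mathcal{K}(f_\rho,t)=\inf_{g\in\mathcal{H}_K}
     \bigl\{\|f_\rho-g\|_{L^2_{\rho_X}}+t\|g\|_K\bigr\},
\end{equation}
where $f_\rho$ is the regression function of the sampling measure $\rho$.
\end{document}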
Authors
SUN Xiaojun (孙小军), SHENG Baohuai (盛宝怀)
Department of Economic Statistics, School of International Business, Zhejiang Yuexiu University, Shaoxing, Zhejiang, 312000, P.R. China; Department of Applied Statistics, Shaoxing University, Shaoxing, Zhejiang, 312000, P.R. China
Source
Advances in Mathematics (China) (《数学进展》)
Indexed in CSCD and the Peking University Core Journal list (北大核心)
2024, No. 3, pp. 633-652 (20 pages)
Funding
Supported partially by NSFC (No. 61877039) and the NSFC/RGC Joint Research Scheme (Nos. 12061160462, N_CityU102/20).
Keywords
learning theory; maximum correntropy criterion; kernel regularized regression; learning rate