Abstract
Research on support vector machines is currently a hot topic in the field of artificial intelligence, and large-sample regression based on support vector machines remains a very challenging problem. Recently, Engel et al. proposed the kernel recursive least squares algorithm (KRLS), based on recursive least squares. This paper presents an adaptive and iterative support vector machine regression algorithm (CAISVR) based on chunking incremental learning and decremental learning procedures. To compare the two methods, experiments were conducted on training speed, training accuracy, generalization performance, and the number of support vectors. The simulation results show that although KRLS yields fewer support vectors than CAISVR, its training time is longer and its training and testing accuracies are lower than those of CAISVR.
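For context on the baseline method, the KRLS algorithm of Engel et al. maintains a sparse dictionary of inputs via an approximate linear dependence (ALD) test and updates the kernel expansion coefficients recursively. The following is a minimal sketch of that scheme, assuming a Gaussian RBF kernel; the hyper-parameters `gamma` and `nu` are illustrative choices, not values taken from the paper.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian RBF kernel (illustrative choice of kernel)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

class KRLS:
    """Sketch of kernel recursive least squares with the approximate
    linear dependence (ALD) sparsification test."""

    def __init__(self, gamma=1.0, nu=0.01):
        self.gamma = gamma
        self.nu = nu          # ALD threshold: larger -> sparser dictionary
        self.dict = None      # retained dictionary inputs
        self.Kinv = None      # inverse of the dictionary kernel matrix
        self.alpha = None     # kernel expansion coefficients
        self.P = None         # auxiliary matrix for the non-growth update

    def _kvec(self, x):
        return np.array([rbf(d, x, self.gamma) for d in self.dict])

    def update(self, x, y):
        x = np.atleast_1d(x)
        if self.dict is None:                  # first sample initializes the state
            k11 = rbf(x, x, self.gamma)
            self.dict = [x]
            self.Kinv = np.array([[1.0 / k11]])
            self.alpha = np.array([y / k11])
            self.P = np.array([[1.0]])
            return
        k = self._kvec(x)
        a = self.Kinv @ k                      # best representation in the dictionary
        delta = rbf(x, x, self.gamma) - k @ a  # ALD residual
        e = y - k @ self.alpha                 # prediction error on the new sample
        if delta > self.nu:                    # sample is "novel": grow the dictionary
            self.dict.append(x)
            n = len(a)
            Kinv_new = np.zeros((n + 1, n + 1))
            Kinv_new[:n, :n] = self.Kinv + np.outer(a, a) / delta
            Kinv_new[:n, n] = -a / delta
            Kinv_new[n, :n] = -a / delta
            Kinv_new[n, n] = 1.0 / delta
            self.Kinv = Kinv_new
            P_new = np.zeros((n + 1, n + 1))
            P_new[:n, :n] = self.P
            P_new[n, n] = 1.0
            self.P = P_new
            self.alpha = np.append(self.alpha - a * e / delta, e / delta)
        else:                                  # representable: coefficient-only update
            q = self.P @ a / (1.0 + a @ self.P @ a)
            self.P = self.P - np.outer(q, a @ self.P)
            self.alpha = self.alpha + self.Kinv @ q * e

    def predict(self, x):
        return self._kvec(np.atleast_1d(x)) @ self.alpha
```

Because the dictionary grows only when the ALD residual exceeds `nu`, the number of retained inputs (the analogue of support vectors in the abstract's comparison) can stay far below the number of training samples.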
Source
Computer Engineering and Applications (《计算机工程与应用》)
CSCD
Peking University Core Journal
2006, No. 6, pp. 36-38 and 57 (4 pages)
Funding
National Natural Science Foundation of China (No. 10471045)
Natural Science Foundation of Guangdong Province (Nos. 031360, 04020079)
Seedling Project for High-Level University Construction, South China University of Technology (No. D76010)
Keywords
support vector machines
adaptive and iterative regression algorithm
kernel recursive least squares algorithm
large-sample regression