
Selective Ensemble Algorithm Based on Regression Problems (Cited by: 2)
Abstract: This paper proposes SER-BagBoosting Trees, a selective ensemble learning algorithm for regression problems. It uses classification and regression trees (CART) as base learners, combines the characteristics of the Boosting and Bagging algorithms, and selects ensemble members with a variable-similarity clustering technique and a greedy algorithm. Compared with several common machine learning algorithms, it generally achieves better generalization performance and higher running efficiency than other ensemble learning algorithms. (A code sketch of the selection idea appears after the keyword list below.)
Author: Chen Kai (陈凯)
Source: Computer Engineering (《计算机工程》), 2009, No. 21, pp. 17-19 (3 pages); indexed in CAS, CSCD, and the Peking University Core Journals list
Funding: Key Program of the National Natural Science Foundation of China (10431010); Major Project of the Key Research Base Fund of the Ministry of Education (05JJD910001); Fund of the Applied Statistics Center, Renmin University of China
Keywords: Classification and Regression Trees (CART); bootstrap; selective ensemble
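
The record gives only the abstract, so the following Python sketch is a rough illustration of the selective-ensemble idea it describes rather than the paper's actual procedure: CART regressors are grown on bootstrap samples (the Bagging component), and a small subset is then chosen by greedy forward selection on a held-out set (the selective step). The class name SERBagBoostingTrees, the parameters n_estimators and n_select, and the use of scikit-learn's DecisionTreeRegressor are illustrative assumptions; the paper's variable-similarity clustering and its Boosting-style component are not reproduced here.

import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

class SERBagBoostingTrees:
    """Illustrative sketch only: bagged CART regressors plus greedy selective ensemble."""

    def __init__(self, n_estimators=50, n_select=10, max_depth=4, random_state=0):
        self.n_estimators = n_estimators   # number of bootstrap trees to grow
        self.n_select = n_select           # size of the selected sub-ensemble
        self.max_depth = max_depth
        self.rng = np.random.RandomState(random_state)
        self.trees_ = []

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
        # Hold out part of the data to guide the greedy selection step.
        X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)
        trees = []
        for _ in range(self.n_estimators):
            # Bagging component: fit a CART regressor on a bootstrap sample.
            idx = self.rng.randint(0, len(X_tr), len(X_tr))
            trees.append(DecisionTreeRegressor(max_depth=self.max_depth).fit(X_tr[idx], y_tr[idx]))
        # Validation predictions of every candidate tree.
        preds = np.array([t.predict(X_val) for t in trees])
        # Selective step: greedily add the tree that most reduces the
        # mean squared error of the simple-average ensemble on X_val.
        selected, running_sum = [], np.zeros(len(y_val))
        for _ in range(min(self.n_select, self.n_estimators)):
            best_i, best_mse = -1, np.inf
            for i in range(len(trees)):
                if i in selected:
                    continue
                avg = (running_sum + preds[i]) / (len(selected) + 1)
                mse = np.mean((avg - y_val) ** 2)
                if mse < best_mse:
                    best_i, best_mse = i, mse
            selected.append(best_i)
            running_sum += preds[best_i]
        self.trees_ = [trees[i] for i in selected]
        return self

    def predict(self, X):
        # Average the predictions of the selected trees.
        return np.mean([t.predict(np.asarray(X, dtype=float)) for t in self.trees_], axis=0)

As a usage example, SERBagBoostingTrees(n_estimators=50, n_select=10).fit(X_train, y_train).predict(X_test) returns the average prediction of the ten selected trees. Greedy forward selection on a held-out set is one common way to realize a selective ensemble; according to the abstract, the paper additionally clusters similar base learners before selecting.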

References (9)

  • 1 Dietterich T G. Machine Learning Research: Four Current Directions[J]. AI Magazine, 1997, 18(4): 97-136.
  • 2 Breiman L. Bagging Predictors[J]. Machine Learning, 1996, 24(2): 123-140.
  • 3 Schapire R E, Freund Y, Bartlett P, et al. Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods[J]. The Annals of Statistics, 1998, 26(5): 1651-1686.
  • 4 Breiman L. Arcing the Edge[J]. The Annals of Statistics, 1998, 26(3): 801-823.
  • 5 Breiman L. Random Forests[J]. Machine Learning, 2001, 45(1): 5-32.
  • 6 Zhou Zhihua, Wu Jianxin, Tang Wei. Ensembling Neural Networks: Many Could Be Better than All[J]. Artificial Intelligence, 2002, 137(1-2): 239-263.
  • 7 Breiman L, Friedman J H, Olshen R A, et al. Classification and Regression Trees[M]. New York, USA: Chapman & Hall Press, 1984.
  • 8 Dettling M. BagBoosting for Tumor Classification with Gene Expression Data[J]. Bioinformatics, 2004, 20(18): 3583-3593.
  • 9 Breiman L. Statistical Modeling: The Two Cultures[J]. Statistical Science, 2001, 16(3): 199-231.

Citing articles: 2

Secondary citing articles: 9
