
Relevance Vector Regression Robustization Method Based on Predictive Value Replacement (cited by: 1)
Abstract: The performance of the relevance vector machine (RVM) is easily degraded by outliers. A method is proposed to enhance the robustness of RVM. Its main idea is as follows: first, an RVM is trained on the original training data; then, according to a chosen criterion, some samples are selected from the original data and their target values are replaced by the RVM's predictive values; finally, the RVM is retrained on the modified training set. This process can be repeated several times. Experiments demonstrate that, compared with RVM and the variational robust relevance vector machine (VRRVM), the proposed algorithm is less sensitive to outliers while preserving the sparseness of RVM.
Authors: GUO Gao, JU Hua
Affiliation: Xi'an University of Technology
Source: Software, 2012, No. 6, pp. 1-5
Funding: Shaanxi Provincial Department of Education 2009 Science and Technology Research Program (Project No. 09JK615)
Keywords: artificial intelligence; support vector machine; relevance vector machine; sparseness; robustness; outlier
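The replacement procedure summarized in the abstract can be sketched in code. Since no reference implementation accompanies this page, the following is only a minimal illustration of the idea: RBF kernel ridge regression stands in for the RVM, and a MAD-based residual threshold stands in for the paper's unspecified sample-selection criterion. All function names, parameters, and thresholds below are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def kernel_ridge_fit(X, y, gamma=1.0, alpha=1e-2):
    # RBF kernel ridge regression: a simple stand-in for the RVM.
    K = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    return np.linalg.solve(K + alpha * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, coef, X_new, gamma=1.0):
    K = np.exp(-gamma * ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1))
    return K @ coef

def robust_fit(X, y, n_rounds=3, k=2.5, gamma=1.0, alpha=1e-2):
    """Iteratively replace targets with large residuals by model predictions
    (the 'predictive value replacement' idea), then retrain."""
    y_work = y.copy()
    for _ in range(n_rounds):
        coef = kernel_ridge_fit(X, y_work, gamma, alpha)
        pred = kernel_ridge_predict(X, coef, X, gamma)
        resid = np.abs(y_work - pred)
        # MAD-based selection criterion (an assumption; the paper leaves it open)
        mad = np.median(np.abs(resid - np.median(resid))) + 1e-12
        outlier = resid > k * 1.4826 * mad
        y_work[outlier] = pred[outlier]  # replace targets by predictive values
    return coef

# Demo: sinc curve with a few gross errors injected.
rng = np.random.default_rng(0)
X = np.linspace(-5, 5, 100)[:, None]
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(100)
y[[10, 50, 80]] += 3.0  # samples with gross error
coef = robust_fit(X, y)
```

On clean sinc data corrupted this way, the repaired fit typically tracks the true curve more closely than a single fit on the corrupted targets, which is the behavior the abstract reports for the RVM-based version.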

References (15)

1. HAWKINS D M. Identification of Outliers[M]. London: Chapman and Hall, 1980.
2. ZHANG J S, GUO G. Weighted robust support vector regression method[J]. Chinese Journal of Computers, 2005, 28(7): 1171-1177.
3. XU H, CARAMANIS C, MANNOR S. Sparse algorithms are not stable: a no-free-lunch theorem[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 34(2): 187-193.
4. GUO G, ZHANG G Y, ZHANG J S. A method to sparsify the solution of support vector regression[J]. Neural Computing & Applications, 2010, (1): 115-122.
5. TIPPING M E. Sparse Bayesian learning and the relevance vector machine[J]. Journal of Machine Learning Research, 2001, 1: 211-244.
6. VAPNIK V. The Nature of Statistical Learning Theory[M]. New York: Springer-Verlag, 1995.
7. SCHÖLKOPF B, SMOLA A J. Learning with Kernels[M]. Cambridge, MA: MIT Press, 2002.
8. FAUL A C, TIPPING M E. A variational approach to robust regression[A]. Springer-Verlag, 2001: 95-102.
9. TIPPING M E, LAWRENCE N D. Variational inference for Student-t models: robust Bayesian interpolation and generalized component analysis[J]. Neurocomputing, 2005: 123-141.
10. SUYKENS J A K, DE BRABANTER J, LUKAS L, VANDEWALLE J. Weighted least squares support vector machines: robustness and sparse approximation[J]. Neurocomputing, 2002, (1-4): 85-105.

