
Improved LP Algorithm for a One-Class Classifier Based on a Local Density Factor
Abstract: To improve the recognition rate of support-domain-based one-class classifiers, local density information is incorporated into the classifier design. Building on the LP algorithm of Campbell et al., a local density factor ρ_i, computed by the k-nearest-neighbor method, is introduced for each sample point and the original algorithm is reformulated, so that data in regions of different density no longer contribute equally to the classifier: the influence of points in high-density regions on the classification hyperplane is strengthened, while that of points in low-density regions is weakened. As a result, the hyperplane is automatically drawn toward high-density regions and the recognition rate improves. Experiments on real data sets show that the resulting density-based LP (D-LP) algorithm generalizes considerably better than the original algorithm.
Authors: Feng Aimin, Chen Bin
Source: Journal of Nanjing University of Aeronautics & Astronautics (EI, CAS, CSCD, Peking University Core), 2006, No. 6: 727-731 (5 pages)
Fund: Supported by the Youth Research Foundation of Nanjing University of Aeronautics and Astronautics (1004-274015)
Keywords: one/single-class classifier; linear programming (LP); support domain; local density factor
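
The abstract describes the construction only in words, so the following is a minimal illustrative sketch, not the authors' exact formulation, of a density-weighted variant of the Campbell-Bennett LP novelty detector: each training point gets a k-NN local density factor ρ_i, and the LP minimizes Σ_i ρ_i[(Kα)_i + b] + λ Σ_i ξ_i subject to (Kα)_i + b + ξ_i ≥ 0, Σ_j α_j = 1, α ≥ 0, ξ ≥ 0, so that dense-region points pull hardest on the hyperplane. The RBF kernel, the particular density estimate (inverse mean k-NN distance, rescaled to unit mean), the soft-margin weight lam, and all function names (fit_dlp, local_density, decision_function) are assumptions introduced for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian kernel matrix K[i, j] = exp(-gamma * ||X_i - Y_j||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def local_density(X, k=5):
    # Assumed density factor: rho_i = 1 / (mean distance to k nearest
    # neighbours), rescaled to unit mean so the LP keeps its scale.
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1))
    np.fill_diagonal(d, np.inf)            # exclude each point's self-distance
    knn = np.sort(d, axis=1)[:, :k]
    rho = 1.0 / knn.mean(axis=1)
    return rho / rho.mean()

def fit_dlp(X, k=5, gamma=1.0, lam=10.0):
    """Density-weighted LP one-class classifier (illustrative sketch).

    Decision function f(x) = sum_j alpha_j K(x, x_j) + b; accept if f(x) >= 0.
    Minimizing sum_i rho_i * f(x_i) + lam * sum_i xi_i pulls the hyperplane
    onto the data, hardest where the density factor rho_i is largest.
    """
    m = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    rho = local_density(X, k)

    # LP variables z = [alpha_1..alpha_m, b, xi_1..xi_m]
    c = np.concatenate([rho @ K, [rho.sum()], lam * np.ones(m)])

    # soft constraints: (K alpha)_i + b + xi_i >= 0
    A_ub = np.hstack([-K, -np.ones((m, 1)), -np.eye(m)])
    b_ub = np.zeros(m)

    # normalization sum_j alpha_j = 1 rules out the trivial solution alpha = 0
    A_eq = np.concatenate([np.ones(m), [0.0], np.zeros(m)])[None, :]

    bounds = [(0, None)] * m + [(None, None)] + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
    alpha, b = res.x[:m], res.x[m]
    return alpha, b

def decision_function(X_train, alpha, b, X_test, gamma=1.0):
    # f >= 0 -> accepted as target class; f < 0 -> flagged as novel
    return rbf_kernel(X_test, X_train, gamma) @ alpha + b
```

Usage would be `alpha, b = fit_dlp(X)` followed by `decision_function(X, alpha, b, X_new)`, with negative scores flagged as outliers. Setting rho to all ones recovers the unweighted Campbell-style LP, which is exactly the comparison the abstract's experiments describe.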

References (12)

  • 1 Tax D M J. One-class classification: concept-learning in the absence of counter-examples [D]. Delft, Netherlands: Delft University of Technology, 2001.
  • 2 Schölkopf B, Platt J, Shawe-Taylor J, et al. Estimating the support of a high-dimensional distribution [J]. Neural Computation, 2001, 13(7): 1443-1471.
  • 3 Tax D M J, Duin R P W. Support vector data description [J]. Machine Learning, 2004, 54(1): 45-66.
  • 4 Campbell C, Bennett K P. A linear programming approach to novelty detection [C]//Advances in Neural Information Processing Systems. Cambridge: MIT Press, 2001.
  • 5 Breunig M M, Kriegel H P, et al. LOF: identifying density-based local outliers [C]//Proceedings of the ACM SIGMOD International Conference on Management of Data. Texas: ACM Press, 2000: 93-104.
  • 6 Vapnik V. Statistical learning theory [M]. New York: Wiley, 1998.
  • 7 Rätsch G, Mika S, Schölkopf B, et al. Constructing boosting algorithms from SVMs: an application to one-class classification [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, 24(9): 1184-1199.
  • 8 Schölkopf B, Burges C, Vapnik V. Extracting support data for a given task [C]//Fayyad U, Uthurusamy R, eds. First International Conference on Knowledge Discovery & Data Mining. Menlo Park: AAAI Press, 1995.
  • 9 Xu Jianhua, Zhang Xuegong, Li Yanda. Advances in support vector machines [J]. Control and Decision, 2004, 19(5): 481-484.
  • 10 Muñoz A, Moguerza J M. One-class support vector machines and density estimation: the precise relation [C]//Sanfeliu A, et al., eds. Progress in Pattern Recognition, LNCS. Berlin: Springer, 2004: 216-220.
