

Passenger Vehicles Type Classification Based on SVM
Abstract: Feature selection with a support vector machine (SVM) was applied to seven candidate features of passenger vehicles, including length, width, height, and the width-to-length ratio. The subset with the highest accuracy consists of length, width, height, width-to-length ratio, and width-to-height ratio, and this subset was used as the sample features for classification. Four classes of passenger vehicle were classified, with 80 samples per class: 50 for training and 30 for prediction. The results show that classification accuracy reaches 100% for class 1, above 96% for classes 2 and 4, and above 93% for class 3, which is better than the result obtained using only length, width, and height as features. An SVM with parameter optimization was then applied to the same four vehicle classes and the results were compared. Owing to the properties of the Gaussian function, the RBF kernel was chosen as the kernel function in both applications of the SVM.
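The pipeline the abstract describes (an RBF-kernel C-SVC on the five selected features, 50 training and 30 test samples per class, with grid search standing in for parameter optimization) can be sketched as follows. This is an illustration only: the nominal vehicle dimensions, noise levels, and parameter grid are invented, the data is synthetic, and scikit-learn's `SVC`/`GridSearchCV` substitute for whatever SVM toolbox the authors actually used.

```python
# Hypothetical sketch of the classification pipeline from the abstract.
# All dimensions and noise levels below are invented for illustration;
# the paper's real vehicle measurements are not available here.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def make_class(n, length, width, height):
    """Generate n synthetic vehicles around nominal dimensions (metres)."""
    dims = rng.normal([length, width, height], [0.3, 0.05, 0.1], size=(n, 3))
    l, w, h = dims[:, 0], dims[:, 1], dims[:, 2]
    # Selected feature subset from the paper: length, width, height, w/l, w/h.
    return np.column_stack([l, w, h, w / l, w / h])

# Four assumed nominal classes (length, width, height); 50 train / 30 test each.
nominal = [(6.0, 2.1, 2.6), (8.0, 2.3, 2.9), (10.0, 2.5, 3.1), (12.0, 2.5, 3.3)]
X_train = np.vstack([make_class(50, *d) for d in nominal])
y_train = np.repeat(np.arange(4), 50)
X_test = np.vstack([make_class(30, *d) for d in nominal])
y_test = np.repeat(np.arange(4), 30)

scaler = StandardScaler().fit(X_train)

# Grid search over C and gamma stands in for the paper's parameter optimization.
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [1, 10, 100], "gamma": [0.01, 0.1, 1.0]}, cv=5)
grid.fit(scaler.transform(X_train), y_train)
accuracy = grid.score(scaler.transform(X_test), y_test)
print(f"test accuracy: {accuracy:.2f}")
```

On well-separated synthetic classes like these, the RBF-kernel C-SVC reaches near-perfect test accuracy, which is consistent with the 93-100% per-class figures the paper reports.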
Institution: School of Science, North University of China
Source: Mathematics in Practice and Theory (《数学的实践与认识》), CSCD, Peking University Core Journal, 2012, No. 18, pp. 190-194 (5 pages)
Fund: Shanxi Provincial Natural Science Foundation, 2009 (2009011018-3)
Keywords: feature selection; C-SVC; kernel function; parameter optimization
