
Extended Tree Augmented Naive Bayesian Classifier

Cited by: 6
Abstract: The Tree Augmented Naive Bayesian Classifier (TAN) often outperforms the Naive Bayesian classifier while retaining its computational simplicity and robustness, but TAN requires a prior discretization of continuous variables. To represent data distributions faithfully and avoid information loss, it is important to handle mixed-mode data. This paper derives the maximum likelihood function for hybrid (mixed discrete-continuous) data and proposes a new classifier, the Extended Tree Augmented Naive Bayesian Classifier (ETAN). The proposed classifier removes the restriction that continuous variables must be discretized and can deal with hybrid variables within the TAN framework. Experiments show that the classifier achieves good classification accuracy.
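The abstract builds on the standard TAN construction of Friedman et al. (reference 2): learn a maximum-weight spanning tree over the attributes, with edges weighted by the conditional mutual information between attribute pairs given the class. The paper's ETAN derivation for mixed data is not reproduced in this record, but the discrete Chow-Liu step that it extends can be sketched as follows. This is an illustrative sketch, not the authors' code; the names `cond_mutual_info` and `tan_tree` are invented for the example.

```python
import numpy as np
from itertools import combinations

def cond_mutual_info(xi, xj, c):
    """Empirical conditional mutual information I(Xi; Xj | C) in nats,
    estimated from three discrete sample arrays of equal length."""
    mi = 0.0
    for cv in np.unique(c):
        mask = c == cv
        pc = mask.mean()                      # P(C = cv)
        xi_c, xj_c = xi[mask], xj[mask]
        for a in np.unique(xi_c):
            for b in np.unique(xj_c):
                p_ab = np.mean((xi_c == a) & (xj_c == b))
                if p_ab > 0:
                    p_a = np.mean(xi_c == a)
                    p_b = np.mean(xj_c == b)
                    mi += pc * p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def tan_tree(X, y):
    """Chow-Liu step of TAN: maximum-weight spanning tree over the
    attribute columns of X, with edges weighted by I(Xi; Xj | class).
    Returns a list of undirected edges (i, j) with i < j."""
    d = X.shape[1]
    edges = sorted(
        ((cond_mutual_info(X[:, i], X[:, j], y), i, j)
         for i, j in combinations(range(d), 2)),
        reverse=True)
    parent = list(range(d))                   # union-find for Kruskal
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                          # edge joins two components
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Tiny demo: attributes 0 and 1 are perfectly dependent given the class,
# attribute 2 is independent, so the learned tree must contain edge (0, 1).
X = np.array([[0, 0, 0], [0, 0, 1], [1, 1, 0], [1, 1, 1],
              [0, 0, 0], [0, 0, 1], [1, 1, 0], [1, 1, 1]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(tan_tree(X, y))
```

In the full TAN classifier each attribute then takes the class plus its tree parent as parents; the paper's ETAN replaces the discrete conditional distributions in this structure with ones derived from the mixed-data maximum likelihood function.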
Source: Pattern Recognition and Artificial Intelligence, 2006(4): 469-474 (6 pages). Indexed: EI, CSCD, Peking University Core Journals.
Funding: National Natural Science Foundation of China (No. 70371026)
Keywords: Naive Bayesian Classifier, Learning Bayesian Networks, Tree Augmented Naive Bayesian Classifier (TAN), Extended Tree Augmented Naive Bayesian Classifier (ETAN)

References (13)

1. Langley P, Iba W, Thompson K. An Analysis of Bayesian Classifiers. In: Proc of the 10th National Conference on Artificial Intelligence. San Jose, USA: AAAI Press, 1992, 223-228
2. Friedman N, Geiger D, Goldszmidt M. Bayesian Network Classifiers. Machine Learning, 1997, 29(2-3): 131-163
3. Neapolitan R E. Learning Bayesian Networks. Upper Saddle River, USA: Prentice Hall, 2003
4. Chickering D M, Geiger D, Heckerman D. Learning Bayesian Networks is NP-Complete. In: Fisher D H, Lenz H J, eds. Learning from Data: Artificial Intelligence and Statistics. New York, USA: Springer-Verlag, 1996, 121-130
5. Geiger D. An Entropy-Based Learning Algorithm of Bayesian Conditional Trees. In: Proc of the 8th Annual Conference on Uncertainty in Artificial Intelligence. San Mateo, USA: Morgan Kaufmann, 1992, 92-97
6. Chow C K, Liu C N. Approximating Discrete Probability Distributions with Dependence Trees. IEEE Trans on Information Theory, 1968, 14(3): 462-467
7. Tarjan R E. Finding Optimal Branchings. Networks, 1977, 7: 25-35
8. Johnson R A, Wichern D W. Applied Multivariate Statistical Analysis. Beijing: Tsinghua University Press, 2001
9. Murphy P M, Aha D W. UCI Repository of Machine Learning Databases. http://www.ics.uci.edu/~mlearn/MLRepository.html
10. Fayyad U M, Irani K B. Multi-Interval Discretization of Continuous-Valued Attributes for Classification Learning. In: Proc of the 13th International Joint Conference on Artificial Intelligence. San Mateo, USA: Morgan Kaufmann, 1993, 1022-1027

Co-cited references: 4

Shared references with citing works: 80

Citing articles: 6

Second-level citing articles: 18
