
RESEARCH OF FIELD THEORY BASED ADAPTIVE RESONANCE NEURAL NETWORK (Cited by: 3)
Abstract: Adaptive Resonance Theory (ART) is an important class of competitive neural learning models. Their memory mode resembles that of living organisms, and their memory capacity grows as learning instances accumulate. Moreover, ART models can perform real-time online learning and operate in dynamic environments, so they have promising application prospects. Field Theory, on the other hand, is a class of relaxation models; these are currently the only neural models that require just one round of training. They are good heteroassociative classifiers with large memory capacity that can perform fast, real-time supervised learning.

This paper proposes a new neural learning algorithm named FTART (Field Theory based Adaptive Resonance Theory), which organically combines the advantages of Adaptive Resonance Theory and Field Theory. FTART employs a unique approach to resolving conflicts between instances and to extending classification regions dynamically. It overcomes the drawback of traditional feedforward neural networks, which require users to configure the hidden units by hand, and achieves fast learning and strong generalization. Benchmark tests show that FTART far outperforms BP in both training time and predictive accuracy.

Because artificial neural networks have a remarkable capacity for generalization and for handling nonlinear problems, they have achieved outstanding results in many domains that traditional symbolic approaches cannot attain. ANNs nevertheless have an inherent disadvantage: the concepts they learn are hard to understand, and it is difficult to give an explicit explanation of the reasoning process, because knowledge is represented as large assemblages of connection weights in the network. This has hampered the understanding of how neural models work and limited their application in knowledge discovery and knowledge refinement. The disadvantage can be overcome if comprehensible symbolic rules can be extracted from trained networks, a field that has attracted increasing attention and produced many results. This paper therefore also proposes SPT (Statistics based Producing and Testing), a method that takes a functional view of the trained FTART network to extract symbolic rules from it. Experimental results show that the extracted rules are comprehensible and accurate, and describe the function of the original network well.
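The abstract highlights that FTART, unlike conventional feedforward networks, does not require hidden units to be fixed in advance: categories are grown on demand during resonance. The paper's own update rules are not given in the abstract, so the following is only a minimal classic ART-1-style sketch of that mechanism (vigilance-gated matching with dynamic category creation); the function name, vigilance value, and prototype update are illustrative, not FTART itself.

```python
import numpy as np

def art1_classify(patterns, vigilance=0.7):
    """Minimal ART-1-style competitive learning sketch.

    Category prototypes are created on demand, so the number of
    'hidden units' need not be fixed in advance -- the property the
    abstract highlights for FTART.
    """
    prototypes = []      # learned category prototypes
    assignments = []     # category index chosen for each pattern
    for p in patterns:
        p = np.asarray(p, dtype=float)
        chosen = None
        # Try existing categories in decreasing order of overlap with p.
        order = sorted(range(len(prototypes)),
                       key=lambda j: -np.sum(np.minimum(p, prototypes[j])))
        for j in order:
            match = np.sum(np.minimum(p, prototypes[j])) / max(np.sum(p), 1e-9)
            if match >= vigilance:
                # Resonance: pattern fits category j; fast learning update.
                prototypes[j] = np.minimum(p, prototypes[j])
                chosen = j
                break
        if chosen is None:
            # No category resonates: grow a new one (dynamic expansion).
            prototypes.append(p.copy())
            chosen = len(prototypes) - 1
        assignments.append(chosen)
    return assignments, prototypes
```

For example, two identical binary patterns resonate with the same category, while a disjoint pattern fails the vigilance test and triggers a new one.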
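The abstract describes SPT only at a high level: it takes a "view of functionality", i.e. it treats the trained network as a black box, produces candidate symbolic rules, and tests them statistically. As a hedged sketch of that produce-and-test idea over a small discrete input space, one might write the following; the function names, single-feature rule form, and accuracy threshold are assumptions for illustration, not the published SPT algorithm.

```python
import itertools

def extract_rules(predict, feature_values, min_accuracy=0.95):
    """Black-box produce-and-test rule extraction sketch.

    `predict` is any trained classifier (e.g. an FTART network) queried
    as a black box. Produce step: enumerate candidate single-feature
    rules; test step: keep only rules whose predictions agree with the
    network on at least `min_accuracy` of the inputs they cover.
    """
    # All input combinations over the (small, discrete) feature space.
    space = list(itertools.product(*feature_values))
    rules = []
    for i, values in enumerate(feature_values):
        for v in values:
            covered = [x for x in space if x[i] == v]
            labels = [predict(x) for x in covered]
            # Statistics over covered inputs: majority label and accuracy.
            best = max(set(labels), key=labels.count)
            acc = labels.count(best) / len(labels)
            if acc >= min_accuracy:  # test step: accept faithful rules only
                rules.append((i, v, best, acc))
    return rules  # each rule reads: "if feature i == v then class best"
```

Against a toy target that depends only on the first feature, the sketch keeps the two faithful rules on that feature and rejects rules on the irrelevant one.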
Source: Journal of Nanjing University (Natural Science), CAS CSCD, 2000, Issue 2, pp. 140-147 (8 pages).
Funding: National Natural Science Foundation of China (No. 69875006); Natural Science Foundation of Jiangsu Province (No. BK99036).
Keywords: Machine Learning; Neural Networks; Adaptive Resonance Theory; Field Theory; Rule Extraction


相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部