
关于多层前馈神经网络学习能力的一个注记 (Cited by: 3)

On the Learning Capability of Multi-layer Feedforward Neural Networks
Abstract: This paper gives a method for reducing the number of hidden-layer nodes of a single-hidden-layer feedforward neural network by suitably choosing the weights connecting the input layer to the hidden layer. Using this method, the learning capability of a standard single-hidden-layer network with two hidden nodes is analyzed, and the choice of weights in the binary and ternary XOR problems is discussed in detail.
Author: Tian Dagang (田大钢)
Source: Mathematics in Practice and Theory (《数学的实践与认识》, CSCD, Peking University Core Journal), 2009, No. 23: 187-197 (11 pages)
Funding: Shanghai Key Discipline Project (S30504)
Keywords: learning capability; neural network size; nonlinear analysis; XOR problem
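As an illustration of the two-hidden-neuron setting the abstract refers to, the binary XOR function can be computed by a single-hidden-layer network with two threshold units once the input-to-hidden weights are chosen appropriately. The weights below are a common textbook construction (hidden unit 1 acts as OR, hidden unit 2 as AND), not necessarily the choice derived in the paper:

```python
import numpy as np

def step(z):
    # Hard-threshold activation: 1 if z > 0, else 0.
    return (z > 0).astype(int)

# Hand-chosen weights (illustrative, not the paper's construction):
# h1 fires when x1 OR x2; h2 fires when x1 AND x2.
W_hidden = np.array([[1.0, 1.0],   # weights into h1
                     [1.0, 1.0]])  # weights into h2
b_hidden = np.array([-0.5, -1.5])  # thresholds for OR and AND
w_out = np.array([1.0, -1.0])      # output fires on "OR but not AND"
b_out = -0.5

def xor_net(x):
    h = step(x @ W_hidden.T + b_hidden)
    return int(step(h @ w_out + b_out))

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, xor_net(np.array(x)))   # prints 0, 1, 1, 0 respectively
```

Two hidden neurons suffice here because XOR is linearly separable in the (OR, AND) feature space even though it is not separable in the input space; the paper's contribution concerns how far such weight choices can be pushed, including the ternary case.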
  • Related Literature

References (17)

  • 1 Poggio T, Girosi F. Regularization algorithms for learning that are equivalent to multilayer networks[J]. Science, 1990, 247: 978-982.
  • 2 Scarselli F, Tsoi A C. Universal approximation using feedforward neural networks: A survey of some existing methods, and some new results[J]. Neural Networks, 1998, 11: 15-37.
  • 3 Tamura S, Tateishi M. Capabilities of a four-layered feedforward neural network: Four layers versus three[J]. IEEE Trans Neural Networks, 1997, 8: 251-255.
  • 4 Sartori M A, et al. A simple method to derive bounds on the size and to train multilayer neural networks[J]. IEEE Trans Neural Networks, 1991, 2: 467-471.
  • 5 Huang G B. Learning capability and storage capacity of two-hidden-layer feedforward networks[J]. IEEE Trans Neural Networks, 2003, 14: 274-281.
  • 6 Huang G B, Babri H A. Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions[J]. IEEE Trans Neural Networks, 1998, 9: 224-229.
  • 7 Huang S C, Huang Y F. Bounds on number of hidden neurons in multilayer perceptrons[J]. IEEE Trans Neural Networks, 1991, 2: 47-55.
  • 8 Baum E B, Haussler D. What size net gives valid generalization?[J]. Neural Comput, 1989, 1: 151-160.
  • 9 Tian Dagang. Learning capability of feedforward neural networks[J]. Systems Engineering - Theory & Practice, 2004, 24(11): 76-81. (Cited by: 3)
  • 10 Cheng Xiang, Shenqiang Q Ding, Tong Heng Lee. Geometrical interpretation and architecture selection of MLP[J]. IEEE Trans Neural Networks, 2005, 16: 84-96.

Secondary References (1)

Co-cited Literature (4)

Co-citing Literature (24)

  • 1 Tian Dagang. Learning capability of feedforward neural networks[J]. Systems Engineering - Theory & Practice, 2004, 24(11): 76-81. (Cited by: 3)
  • 2 Zhang Songbai, Tian Dongfeng, Wu Jun, Li Jinhong. Characteristic ratios of correlated nuclides in light water reactor fuel and identification of spent fuel reactor types[J]. Atomic Energy Science and Technology, 2007, 41(1): 104-108. (Cited by: 1)
  • 3 Carroll S M, Dickinson B W. Construction of neural nets using the Radon transform[C]. Proceedings of the IEEE 1989 International Joint Conference on Neural Networks, 1989, 1: 607-611.
  • 4 Vapnik V N. Statistical Learning Theory[M]. John Wiley & Sons, Inc., 1998: 33-52.
  • 5 Anthony M, Bartlett P L. Function learning from interpolation[J]. Combinatorics, Probability and Computing, 2000, 9(3): 213-225.
  • 6 Anthony M. Interpolation and learning in artificial neural networks[R]. NeuroCOLT 8556, 1996, 27: 502-508.
  • 7 Huang G B, Babri H A. Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions[J]. IEEE Trans Neural Networks, 1998, 9: 224-229.
  • 8 Huang G B. Learning capability and storage capacity of two-hidden-layer feedforward networks[J]. IEEE Trans Neural Networks, 2003, 14: 274-281.
  • 9 Touretzky D S, Pomerleau D A. What is hidden in the hidden layers?[J]. Byte, 1989, 14: 227-233.
  • 10 Scarselli F, Tsoi A C. Universal approximation using feedforward neural networks: A survey of some existing methods, and some new results[J]. Neural Networks, 1998, 11: 15-37.

Citing Literature (3)

Secondary Citing Literature (3)
