[1] Schmidhuber J. Discovering neural nets with low Kolmogorov complexity and high generalization capability. Neural Networks, 1997, 10(5): 857~873.
[2] Pearlmutter B A, Rosenfeld R. Chaitin-Kolmogorov complexity and generalization in neural networks. In: Proc. Advances in Neural Information Processing Systems 3. San Mateo, CA, 1991: 925~931.
[3] Weigend A S, Rumelhart D E, Huberman B A. Generalization by weight-elimination with application to forecasting. In: Proc. Advances in Neural Information Processing Systems 3. San Mateo, CA, 1991: 875~882.
[4] Amari S, Murata N, Muller K. Asymptotic statistical theory of overtraining and cross-validation. IEEE Trans. Neural Networks, 1997, 8(5): 985~996.
[7] Kanungo T, Mount D M, Netanyahu N S, et al. An efficient k-means clustering algorithm: analysis and implementation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, 24(7): 881~892.
[8] Baum E B, Haussler D. What size net gives valid generalization? In: Proc. Advances in Neural Information Processing Systems 1. San Mateo, CA, 1989: 81~90.
[9] Moody J E. The effective number of parameters: an analysis of generalization and regularization in nonlinear learning systems. In: Proc. Advances in Neural Information Processing Systems 4. San Mateo, CA, 1992: 847~854.
[10] Barron A R. Approximation and estimation bounds for artificial neural networks. Machine Learning, 1994, 14: 115~133.