Abstract
This paper studies the global-minimum conditions of outer-supervised feedforward neural networks (FNNs) trained in batch mode and in sequential mode. For sequential training, it is shown that there always exist N local minimum points on the corresponding error surface (N is the number of training samples). For batch training, it is shown that the necessary and sufficient condition for an FNN to attain a zero-cost global minimum is that the range space R(Y) of the outer-supervised signal matrix Y be contained in the range space R(X) of the hidden-layer output matrix X, i.e., R(Y) ⊆ R(X); a sufficient condition is that the number of hidden nodes M be greater than or equal to the number of non-coincident training samples N, i.e., M ≥ N. It is further shown that, under these conditions, the error surface of the defined cost function contains no local minima other than the zero-cost global minimum. Consequently, if C ≤ M < N (C is the number of output nodes), it is possible for the FNN to attain a zero-cost global minimum; if M < C ≤ N, the FNN cannot converge to a zero-cost global minimum no matter how it is trained.
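The necessary-and-sufficient condition in the abstract is a statement about the linear solvability of the output layer: R(Y) ⊆ R(X) holds exactly when the system X W = Y has an exact solution for the output weights W. As a hedged illustration (the function and matrices below are our own construction, not from the paper), the condition can be checked numerically with a rank comparison:

```python
import numpy as np

def zero_cost_attainable(X, Y):
    """True iff the column space of Y lies in the column space of X,
    i.e. rank([X | Y]) == rank(X), so X W = Y is exactly solvable."""
    return np.linalg.matrix_rank(np.hstack([X, Y])) == np.linalg.matrix_rank(X)

# X: hidden-layer output matrix with N = 4 samples (rows), M = 2 hidden nodes (columns).
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [2.0, 1.0]])

# Y1 = X @ w lies in R(X) by construction, so a zero-cost solution exists
# even though M < N (here C = 1, so C <= M < N).
Y1 = X @ np.array([[1.0], [2.0]])
print(zero_cost_attainable(X, Y1))   # True

# A target outside R(X): no output-weight setting can reach zero cost.
Y2 = np.array([[1.0], [0.0], [0.0], [0.0]])
print(zero_cost_attainable(X, Y2))   # False
```

When M ≥ N and X has full row rank, R(X) spans the whole sample space, so the test succeeds for any Y, matching the paper's sufficient condition.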
Source
Acta Electronica Sinica (《电子学报》)
Indexed in: EI, CAS, CSCD, PKU Core (北大核心)
1999, No. 4, pp. 98-101 (4 pages)
Funding
National Natural Science Foundation of China
Keywords
Feedforward neural networks, batch-style training, sequential training, Hessian, global minimum solution, local minimum solution, embedded subspace